CN113034585A - Offset state test method, test device and storage medium - Google Patents

Offset state test method, test device and storage medium

Info

Publication number
CN113034585A
Authority
CN
China
Prior art keywords
position information
offset state
determining
test
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110450740.5A
Other languages
Chinese (zh)
Other versions
CN113034585B (en)
Inventor
徐振宾
徐瑞兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Optical Technology Co Ltd
Original Assignee
Goertek Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Inc filed Critical Goertek Inc
Priority to CN202110450740.5A
Publication of CN113034585A
Application granted
Publication of CN113034585B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention discloses an offset state testing method, testing equipment and a computer readable storage medium. The method comprises the following steps: acquiring a first test image corresponding to a first projector of an augmented reality device, wherein the first test image comprises a test pattern projected by the first projector; acquiring position parameters and gray values of preset pixel points in the test image, and determining position information corresponding to the test pattern according to the position parameters and gray values of the preset pixel points; and determining the offset state of the first projector according to the position information. The invention thereby improves the accuracy of the offset test.

Description

Offset state test method, test device and storage medium
Technical Field
The present invention relates to the field of augmented reality device manufacturing technologies, and in particular, to an offset state testing method, testing device, and computer-readable storage medium.
Background
With the progress of science and technology, the development of AR (Augmented Reality) technology is becoming mature. AR devices have become increasingly popular. In the related art, in order to allow a user to see virtual information based on an AR device, a virtual image may be projected into human eyes through a waveguide sheet.
In the related art, the left-eye projector and the right-eye projector of the AR device are provided independently. To prevent the positions of the virtual objects seen by the two eyes from deviating relative to the same real object when the AR device is used, the positional relationship between each light engine and the support needs to be adjusted when the AR device is assembled.
In the related art, in order to accurately adjust the position between each light engine and the support, the projection offset states of the two light engines need to be determined first, and a test image including the projected test pattern and a target pattern can be captured by a camera. However, the projection resolution of the light engine is generally lower than the resolution of the camera, so several pixel points in the test image correspond to one pixel point of the projected image, the line profile of the test pattern in the test image is therefore thick, its center cannot be determined, and an accurate comparison with the target pattern cannot be performed. Thus, the test result is inaccurate.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The present invention provides a method, a device and a computer readable storage medium for testing an offset state, which are capable of improving the accuracy of an offset test.
In order to achieve the above object, the present invention provides an offset state testing method, including the steps of:
acquiring a first test image corresponding to a first projection machine of augmented reality equipment, wherein the first test image comprises a test pattern projected by the first projection machine;
acquiring position parameters and gray values of preset pixel points in the test image, and determining position information corresponding to the test pattern according to the position parameters and gray values of the preset pixel points;
and determining the offset state of the first projection light machine according to the position information.
Optionally, the test pattern is a cross-shaped pattern comprising a first profile in the horizontal direction and a second profile in the vertical direction.
Optionally, before the step of obtaining the position parameters and the gray values of the preset pixels in the test image and determining the position information corresponding to the test pattern according to the position parameters and the gray values of the plurality of preset pixels, the method further includes:
taking pixel points corresponding to a first pixel column and a second pixel column in a first preset area as preset pixel points; and/or,
taking pixel points corresponding to a first pixel row and a second pixel row in a second preset area as the preset pixel points.
Optionally, the step of obtaining the position parameters and the gray values of the preset pixel points in the test image, and determining the position information corresponding to the test pattern according to the position parameters and the gray values of the plurality of preset pixel points includes:
acquiring position parameters and gray values of all the preset pixel points on the first pixel column and the second pixel column;
determining first position information corresponding to the test pattern according to the position parameters and the gray values of all the preset pixel points on the first pixel column, and determining second position information corresponding to the test pattern according to the position parameters and the gray values of all the preset pixel points on the second pixel column; and/or,
and determining third position information corresponding to the test pattern according to the position parameters and the gray values of the preset pixel points on the first pixel row, and determining fourth position information corresponding to the test pattern according to the position parameters and the gray values of the preset pixel points on the second pixel row.
Optionally, before the step of determining the offset state of the first projector according to the position information, the method further includes:
acquiring standard position information;
the step of determining the offset state of the first projector according to the position information includes:
and determining the offset state of the first projection optical machine according to the difference value between the position information and the standard position information.
Optionally, the step of acquiring the standard location information includes:
acquiring the stored standard position information; alternatively,
and acquiring projection position information of a second projection light machine of the augmented reality equipment as the standard position information.
Optionally, after the step of determining the offset state of the first projector according to the position information, the method further includes:
determining an adjusting parameter of the first projector according to the offset state;
outputting the adjusting parameters; and/or,
and sending the adjusting parameters to adjusting equipment so that the adjusting equipment can adjust the installation position of the first projection machine according to the adjusting parameters.
Optionally, after the step of determining the offset state of the first projector according to the position information, the method further includes:
and outputting the offset state.
In addition, to achieve the above object, the present invention further provides a testing apparatus, which includes a memory, a processor, and an offset state testing program stored on the memory and executable on the processor, and when the offset state testing program is executed by the processor, the steps of the offset state testing method as described above are implemented.
Further, to achieve the above object, the present invention also provides a computer readable storage medium having stored thereon an offset state test program which, when executed by a processor, implements the steps of the offset state test method as described above.
The embodiment of the invention provides an offset state testing method, testing equipment and a computer readable storage medium, which are used for obtaining a first testing image corresponding to a first projector of augmented reality equipment, wherein the first testing image comprises a testing pattern projected by the first projector, obtaining position parameters and gray values of preset pixel points in the testing image, determining position information corresponding to the testing pattern according to the position parameters and the gray values of a plurality of the preset pixel points, and determining the offset state of the first projector according to the position information. The center of the line profile of the test pattern can be determined according to the gray value, so that the difference between the projection pattern and the standard position can be accurately determined, and the effect of improving the accuracy of the offset test is achieved.
Drawings
Fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating an offset state testing method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a test image according to an embodiment of the present invention;
FIG. 4 is a graph showing a relationship between pixel position parameters and gray-level values according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating standard position parameters according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating an offset state testing method according to another embodiment of the present invention;
FIG. 7 is a flowchart illustrating an offset state testing method according to another embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the related art, in order to accurately adjust the position between each light engine and the support, the projection offset states of the two light engines need to be determined first, and a test image including the projected test pattern and a target pattern can generally be captured by a camera. However, the projection resolution of the light engine is generally lower than the resolution of the camera, so several pixel points in the test image correspond to one pixel point of the projected image, the line profile of the test pattern in the test image is therefore thick, its center cannot be determined, and an accurate comparison with the target pattern cannot be performed. Thus, the test result is inaccurate.
In order to solve the above-mentioned drawbacks, an embodiment of the present invention provides an offset state testing method, a testing device, and a computer-readable storage medium, where the method mainly includes the following steps:
acquiring a first test image corresponding to a first projection machine of augmented reality equipment, wherein the first test image comprises a test pattern projected by the first projection machine;
acquiring position parameters and gray values of preset pixel points in the test image, and determining position information corresponding to the test pattern according to the position parameters and gray values of the preset pixel points;
and determining the offset state of the first projection light machine according to the position information.
The center of the line profile of the test pattern can be determined according to the gray value, so that the difference between the projection pattern and the standard position can be accurately determined, and the effect of improving the accuracy of the offset test is achieved.
As shown in fig. 1, fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The terminal of the embodiment of the invention can be testing equipment such as a PC and the like.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard) or a mouse, and the optional user interface 1003 may also include a standard wired interface or a wireless interface. The network interface 1004 may optionally include a standard wired interface or a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an offset state test program.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with the backend server; the processor 1001 may be configured to invoke an offset state test program stored in the memory 1005 and perform the following operations:
acquiring a first test image corresponding to a first projection machine of augmented reality equipment, wherein the first test image comprises a test pattern projected by the first projection machine;
acquiring position parameters and gray values of preset pixel points in the test image, and determining position information corresponding to the test pattern according to the position parameters and gray values of the preset pixel points;
and determining the offset state of the first projection light machine according to the position information.
Further, the processor 1001 may call the offset state test program stored in the memory 1005, and also perform the following operations:
taking pixel points corresponding to a first pixel column and a second pixel column in a first preset area as preset pixel points; and/or,
taking pixel points corresponding to a first pixel row and a second pixel row in a second preset area as the preset pixel points.
Further, the processor 1001 may call the offset state test program stored in the memory 1005, and also perform the following operations:
acquiring position parameters and gray values of all the preset pixel points on the first pixel column and the second pixel column;
determining first position information corresponding to the test pattern according to the position parameters and the gray values of all the preset pixel points on the first pixel column, and determining second position information corresponding to the test pattern according to the position parameters and the gray values of all the preset pixel points on the second pixel column; and/or,
and determining third position information corresponding to the test pattern according to the position parameters and the gray values of the preset pixel points on the first pixel row, and determining fourth position information corresponding to the test pattern according to the position parameters and the gray values of the preset pixel points on the second pixel row.
Further, the processor 1001 may call the offset state test program stored in the memory 1005, and also perform the following operations:
acquiring standard position information;
and determining the offset state of the first projection optical machine according to the difference value between the position information and the standard position information.
Further, the processor 1001 may call the offset state test program stored in the memory 1005, and also perform the following operations:
acquiring the stored standard position information; alternatively,
and acquiring projection position information of a second projection light machine of the augmented reality equipment as the standard position information.
Further, the processor 1001 may call the offset state test program stored in the memory 1005, and also perform the following operations:
determining an adjusting parameter of the first projector according to the offset state;
outputting the adjusting parameters; and/or,
and sending the adjusting parameters to adjusting equipment so that the adjusting equipment can adjust the installation position of the first projection machine according to the adjusting parameters.
Further, the processor 1001 may call the offset state test program stored in the memory 1005, and also perform the following operations:
and outputting the offset state.
Referring to fig. 2, in an embodiment of the offset state testing method of the present invention, the offset state testing method includes the following steps:
step S10, acquiring a first test image corresponding to a first projector of the augmented reality device, wherein the first test image comprises a test pattern projected by the first projector;
step S20, obtaining position parameters and gray values of preset pixel points in the test image, and determining position information corresponding to the test pattern according to the position parameters and gray values of the preset pixel points;
and step S30, determining the offset state of the first projector according to the position information.
With the progress of science and technology, the development of AR (Augmented Reality) technology is becoming mature. AR devices have become increasingly popular. In the related art, in order to allow a user to see virtual information based on an AR device, a virtual image may be projected into human eyes through a waveguide sheet.
The left-eye projector and the right-eye projector of the AR device are provided independently. To prevent the positions of the virtual objects seen by the two eyes from deviating relative to the same real object when the AR device is used, the positional relationship between each light engine and the support needs to be adjusted when the AR device is assembled.
In the related art, in order to accurately adjust the position between each light engine and the support, the projection offset states of the two light engines need to be determined first, and a test image including the projected test pattern and a target pattern can be captured by a camera. However, the projection resolution of the light engine is generally lower than the resolution of the camera, so several pixel points in the test image correspond to one pixel point of the projected image, the line profile of the test pattern in the test image is therefore thick, its center cannot be determined, and an accurate comparison with the target pattern cannot be performed. Thus, the test result is inaccurate.
In order to solve the above-mentioned defects in the related art, an embodiment of the present invention provides a testing method, which is applied to a testing terminal.
In this embodiment, a first test image corresponding to the first projector of the augmented reality device may be obtained first. It will be appreciated that AR devices are typically provided with two separate light projectors, one for each of left and right eye projection. The first projector may be a left projector or a right projector of the AR device.
For example, in the present embodiment, the test terminal may be communicatively connected to a photographing device. The photographing device is arranged at a preset position to photograph the projection pattern of the first projector, so that the test image it captures corresponds to the projection pattern observed by human eyes at a position 4 m away. The first projector is then controlled to project the test pattern.
After the first projector projects the corresponding test pattern, the projection pattern can be shot by the camera device arranged at the preset position; that is, a first test image corresponding to the first projector is collected by the camera device. The camera device then sends the first test image to the test terminal, which receives it. The first test image comprises the test pattern projected by the first projector.
It will be appreciated that the camera may be calibrated before capturing the test image, so that the exact center of the test image is aligned with the exact center of the target pattern. Therefore, when the projection position of the light engine is accurate (i.e. no offset occurs), the projection pattern is at the standard position. For example, when the projected pattern is a cross pattern, the center of the cross pattern is at the center of the test image, and the first contour in the horizontal direction and the second contour in the vertical direction of the cross pattern are perpendicular to each other and aligned with the standard orientation.
After the first test image is obtained, obtaining position parameters and gray values of preset pixel points in the test image, and determining position information corresponding to the test pattern according to the position parameters and gray values of the preset pixel points.
Referring to fig. 3, the test pattern is provided as a cross pattern. The outline of the test pattern refers to a first pattern 11 in a first direction and a second pattern 12 in a second direction which together constitute the cross-shaped pattern (in the image data, each of the first pattern 11 and the second pattern 12 is a straight line having a width of a plurality of pixels). Further, pixel points corresponding to a first pixel column and a second pixel column in a first preset region can be used as the preset pixel points; and/or pixel points corresponding to a first pixel row and a second pixel row in a second preset region can be used as the preset pixel points. The position parameters and the gray values of the preset pixel points on the first pixel column and the second pixel column can then be obtained; first position information corresponding to the test pattern is determined according to the position parameters and the gray values of the preset pixel points on the first pixel column, and second position information corresponding to the test pattern is determined according to the position parameters and the gray values of the preset pixel points on the second pixel column; and/or third position information corresponding to the test pattern is determined according to the position parameters and the gray values of the preset pixel points on the first pixel row, and fourth position information corresponding to the test pattern is determined according to the position parameters and the gray values of the preset pixel points on the second pixel row.
It should be noted that the first preset area and the second preset area are areas that can be set by a user, and the positions of the first pixel column, the second pixel column, the first pixel row and the second pixel row can also be set by a user. To accurately locate the cross-shaped test pattern in the test image, pixel rows or columns near the edges of the image are typically selected. Of course, in some embodiments, instead of defining a preset area, complete pixel rows and/or pixel columns of the test image may be selected directly and the position of the cross-shaped test pattern determined from them, but this may increase interference and the amount of computation.
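The following Python sketch (illustrative only, not part of the original disclosure) shows one way such preset pixel points could be collected from a grayscale test image held in a NumPy array; the particular column and row indices are assumptions, standing in for user-set positions near the image edges:

import numpy as np

def select_preset_pixels(test_image, col_indices=(40, 80), row_indices=(40, 80)):
    """Collect (position parameter, gray value) profiles for two pixel columns
    and two pixel rows. The indices are illustrative placeholders."""
    test_image = np.asarray(test_image, dtype=float)
    # Gray-value profile down each chosen column: position parameter = row offset.
    column_profiles = {c: test_image[:, c] for c in col_indices}
    # Gray-value profile across each chosen row: position parameter = column offset.
    row_profiles = {r: test_image[r, :] for r in row_indices}
    return column_profiles, row_profiles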
For a better understanding of the present solution, specific exemplary embodiments are given below.
Example 1: the position parameters and corresponding gray values of the preset pixel points on the first pixel column and the second pixel column in the first preset area are acquired. A curve of gray value versus position parameter is then fitted for each column, as shown in fig. 4, where curve 1 is the relationship between the position parameters of the pixel points on the first pixel column and their gray values, and curve 2 is the relationship between the position parameters of the pixel points on the second pixel column and their gray values. The ordinate is the gray value and the abscissa is the position parameter. The position parameter may be the relative position of a pixel point in the test image; for example, the position parameter of pixel point B on the first pixel column in fig. 3 may be the separation distance between pixel point A and pixel point B (the minimum unit may be one pixel width). The position parameters of the pixel points corresponding to the peak positions of curve 1 and curve 2 are then taken as the first position information and the second position information corresponding to the test pattern.
In this embodiment, based on the imaging principle of the camera, the position where the gray value is largest along a pixel column is the center of the line profile of the test pattern on that column, so the position parameters of the pixel points corresponding to the peak positions can be used as the first position information and the second position information corresponding to the test pattern.
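A minimal Python sketch of this idea (an interpretation, not the patent's exact algorithm): the gray-value profile along one preset pixel column is treated as a curve over the position parameter, and the peak is refined to sub-pixel precision with a parabolic fit around the brightest pixel:

import numpy as np

def line_center_from_profile(gray_profile):
    """Position parameter (in pixels) of the gray-value peak along one column or row."""
    g = np.asarray(gray_profile, dtype=float)
    i = int(np.argmax(g))                      # brightest pixel
    if i == 0 or i == len(g) - 1:
        return float(i)                        # peak on the border: no refinement
    y0, y1, y2 = g[i - 1], g[i], g[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0.0:
        return float(i)
    # Vertex of the parabola through the three samples around the peak.
    return i + 0.5 * (y0 - y2) / denom

Applied to the first and second pixel columns this would give the first and second position information; applied to the first and second pixel rows it would give the third and fourth position information.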
Example 2: in this example, after the position parameters and the corresponding gray values on the first pixel column and the second pixel column in the first preset region are obtained, the gray values may be sorted directly, the pixel point with the maximum gray value may then be determined from the sorting result, and the position parameter of that pixel point may be used as the position information corresponding to the test pattern. It will be appreciated that the first pixel column and the second pixel column are processed separately, so that the first position information and the second position information can each be determined.
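A corresponding sketch of Example 2 (again an assumed reading, not the original text): the position parameter of the pixel with the largest gray value is taken directly, without curve fitting:

import numpy as np

def line_center_argmax(gray_profile):
    """Index of the maximum gray value along one preset pixel column or row."""
    return int(np.argmax(np.asarray(gray_profile, dtype=float)))

# Hypothetical usage with the profiles collected earlier:
# first_position = line_center_argmax(column_profiles[40])
# second_position = line_center_argmax(column_profiles[80])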
Further, after the position parameters and the gray values of the preset pixel points in the test image are obtained, and the position information corresponding to the test pattern is determined according to the position parameters and the gray values of the plurality of preset pixel points, the offset state of the first projector can be determined according to the position information.
Exemplarily, after the position parameters and the gray values of the preset pixel points in the test image are obtained and the position information corresponding to the test pattern is determined according to the position parameters and the gray values of the plurality of preset pixel points, standard position information can be obtained. The standard position information can be the position of the target pattern, and the position of the target pattern is calibrated in advance, so the standard position information can be saved in advance. Referring to fig. 5, the standard position may be set as the position where the target pattern is located in the test image, as indicated by the dotted line. After the first position information and the second position information are determined, their interval distance from the standard position information can be acquired, and the offset state can be determined according to this interval distance. The offset state includes an offset direction and an offset amount. As shown in fig. 5, the projected position is shifted downward (the offset direction) from the standard position by a vertical interval (the offset amount).
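The following sketch (illustrative; the field names are assumptions and the direction convention assumes image rows grow downwards) derives a vertical offset state from the first and second position information and a stored standard position, in the spirit of fig. 5:

from dataclasses import dataclass

@dataclass
class OffsetState:
    direction: str   # e.g. "up", "down" or "none"
    amount: float    # offset amount in pixels

def vertical_offset_state(first_position, second_position, standard_position):
    """Compare the measured vertical position of the horizontal contour
    with the standard position."""
    measured = 0.5 * (first_position + second_position)   # average over the two columns
    delta = measured - standard_position
    if delta > 0:
        return OffsetState("down", delta)
    if delta < 0:
        return OffsetState("up", -delta)
    return OffsetState("none", 0.0)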
It should be noted that, in some embodiments, the projection position of the second projector may also be used as the standard position information, and the offset state of the projection position between the first projector and the second projector is then determined. It can be understood that the standard position information can be determined from the projection position of the second projector in the same manner as the position information of the test pattern projected by the first projector is determined in this embodiment.
In the technical scheme disclosed in this embodiment, a first test image corresponding to a first projection machine of an augmented reality device is obtained, where the first test image includes a test pattern projected by the first projection machine, a position parameter and a gray value of a preset pixel point in the test image are obtained, position information corresponding to the test pattern is determined according to the position parameters and the gray values of a plurality of preset pixel points, and an offset state of the first projection machine is determined according to the position information. The center of the line profile of the test pattern can be determined according to the gray value, so that the difference between the projection pattern and the standard position can be accurately determined, and the effect of improving the accuracy of the offset test is achieved.
Optionally, referring to fig. 6, based on the foregoing embodiment, in another embodiment, after the step S30, the method further includes:
step S40, determining the adjusting parameter of the first projector according to the offset state;
step S50, outputting the adjusting parameters; and/or sending the adjusting parameter to adjusting equipment so that the adjusting equipment can adjust the installation position of the first projection machine according to the adjusting parameter.
In this embodiment, the offset state includes an offset direction and an offset amount, and after the offset state is determined, the adjustment parameter of the first projector can be determined according to the offset state. The adjustment parameter includes an adjustment direction and an adjustment amount; the adjustment direction is opposite to the offset direction, and the adjustment amount is equal to the offset amount.
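A minimal sketch of this rule (the direction names and the dictionary layout are assumptions): the adjustment parameter simply mirrors the offset state.

OPPOSITE_DIRECTION = {"up": "down", "down": "up",
                      "left": "right", "right": "left", "none": "none"}

def adjustment_parameter(offset_state):
    """offset_state = {"direction": ..., "amount": ...} as determined in step S30;
    the adjustment direction is opposite, the adjustment amount is equal."""
    return {"direction": OPPOSITE_DIRECTION[offset_state["direction"]],
            "amount": offset_state["amount"]}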
Further, the test device may be provided with a user interface. The user interface may be a display or another human-computer interaction device. The test device outputs the adjustment parameters through the user interface so that a tester can adjust the installation position of the first projector according to the adjustment parameters.
Optionally, the test device may also be communicatively connected to an adjustment device. After determining the adjustment parameter, the test device can send the adjustment parameter to the adjustment device so that the adjustment device can adjust the installation position of the first projector according to the adjustment parameter.
In the technical solution disclosed in this embodiment, an adjustment parameter of the first projection optical engine may be determined according to the offset state, and the adjustment parameter is output; and/or sending the adjusting parameter to adjusting equipment so that the adjusting equipment can adjust the installation position of the first projection machine according to the adjusting parameter. Therefore, the effect of correcting the position of the first projection light machine is improved.
Optionally, referring to fig. 7, based on the foregoing embodiment, in another embodiment, after the step S30, the method further includes:
and step S60, outputting the offset state.
In this embodiment, the test device may be provided with a user interface, so that the test device may interact with a user through the user interface. For example, the user interface may be a display screen, such that the test device may display the offset state in the display screen after determining the offset state.
Optionally, in some embodiments, the test device further comprises a network interface. When the test device tests an augmented reality device, the identification mark of the augmented reality device can be acquired first. The identification mark may be a product mark, a production label, or other identification information that distinguishes the augmented reality device from other augmented reality devices. After the identification mark of the augmented reality device is obtained, the identification mark and the offset state can be associated and then sent to a server. After receiving the identification mark and the offset state, the server may store them in association, so that the test data can be shared by remote testers.
Optionally, in some embodiments, after receiving the identification mark and the offset state, the server may further obtain the sending time point at which the test device sent the identification mark and the offset state, or the receiving time point at which the server received them, and then store the identification mark in association with the offset state and the corresponding time point, so that testers can retrieve historical test data by time point.
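A possible shape of such a record (purely illustrative; the field names and the JSON serialization are assumptions, and the patent does not prescribe a transport or schema):

import json
import time

def build_test_record(identification_mark, offset_state):
    """Bundle the identification mark, the offset state and a sending time point
    into one serialized record for the server."""
    record = {
        "id": identification_mark,                        # e.g. product serial number
        "offset_state": offset_state,                     # {"direction": ..., "amount": ...}
        "sent_at": time.strftime("%Y-%m-%dT%H:%M:%S"),    # sending time point
    }
    return json.dumps(record)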
In the technical solution disclosed in this embodiment, the offset state may be output in various ways, thereby improving the diversity of data output ways.
In addition, an embodiment of the present invention further provides a testing apparatus, where the testing apparatus includes a memory, a processor, and an offset state testing program that is stored on the memory and is executable on the processor, and when the offset state testing program is executed by the processor, the steps of the offset state testing method according to the above embodiments are implemented.
In addition, an embodiment of the present invention further provides a computer-readable storage medium, where an offset state testing program is stored on the computer-readable storage medium, and when the offset state testing program is executed by a processor, the steps of the offset state testing method described in the above embodiments are implemented.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises that element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a testing apparatus (which may be a PC or the like) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An offset state testing method, characterized by comprising the steps of:
acquiring a first test image corresponding to a first projection machine of augmented reality equipment, wherein the first test image comprises a test pattern projected by the first projection machine;
acquiring position parameters and gray values of preset pixel points in the test image, and determining position information corresponding to the test pattern according to the position parameters and gray values of the preset pixel points;
and determining the offset state of the first projection light machine according to the position information.
2. The offset state testing method according to claim 1, wherein the test pattern is a cross-shaped pattern including a first profile in a horizontal direction and a second profile in a vertical direction.
3. The method for testing an offset state according to claim 1, wherein before the step of obtaining the position parameters and the gray values of the preset pixels in the test image and determining the position information corresponding to the test pattern according to the position parameters and the gray values of the preset pixels, the method further comprises:
taking pixel points corresponding to a first pixel column and a second pixel column in a first preset area as preset pixel points; and/or,
taking pixel points corresponding to a first pixel row and a second pixel row in a second preset area as the preset pixel points.
4. The method according to claim 3, wherein the step of obtaining the position parameters and gray values of the preset pixels in the test image and determining the position information corresponding to the test pattern according to the position parameters and gray values of the preset pixels comprises:
acquiring position parameters and gray values of all the preset pixel points on the first pixel column and the second pixel column;
determining first position information corresponding to the test pattern according to the position parameters and the gray values of all the preset pixel points on the first pixel column, and determining second position information corresponding to the test pattern according to the position parameters and the gray values of all the preset pixel points on the second pixel column; and/or,
and determining third position information corresponding to the test pattern according to the position parameters and the gray values of the preset pixel points on the first pixel row, and determining fourth position information corresponding to the test pattern according to the position parameters and the gray values of the preset pixel points on the second pixel row.
5. The offset state testing method of claim 1, wherein before the step of determining the offset state of the first projector according to the position information, the method further comprises:
acquiring standard position information;
the step of determining the offset state of the first projector according to the position information includes:
and determining the offset state of the first projection optical machine according to the difference value between the position information and the standard position information.
6. The offset state testing method of claim 5, wherein the step of acquiring the standard position information comprises:
acquiring the stored standard position information; alternatively,
and acquiring projection position information of a second projection light machine of the augmented reality equipment as the standard position information.
7. The offset state testing method of claim 1, wherein after the step of determining the offset state of the first projector according to the position information, the method further comprises:
determining an adjusting parameter of the first projector according to the offset state;
outputting the adjusting parameters; and/or,
and sending the adjusting parameters to adjusting equipment so that the adjusting equipment can adjust the installation position of the first projection machine according to the adjusting parameters.
8. The offset state testing method of claim 1, wherein after the step of determining the offset state of the first projector according to the position information, the method further comprises:
and outputting the offset state.
9. A test apparatus, characterized in that the test apparatus comprises: memory, a processor and an offset state test program stored on the memory and executable on the processor, which when executed by the processor implements the steps of the offset state testing method of any of claims 1 to 8.
10. A computer-readable storage medium, having stored thereon an offset state testing program which, when executed by a processor, implements the steps of the offset state testing method of any of claims 1 to 8.
CN202110450740.5A 2021-04-25 2021-04-25 Offset state test method, test device and storage medium Active CN113034585B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110450740.5A CN113034585B (en) 2021-04-25 2021-04-25 Offset state test method, test device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110450740.5A CN113034585B (en) 2021-04-25 2021-04-25 Offset state test method, test device and storage medium

Publications (2)

Publication Number Publication Date
CN113034585A 2021-06-25
CN113034585B CN113034585B (en) 2023-02-28

Family

ID=76454541

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110450740.5A Active CN113034585B (en) 2021-04-25 2021-04-25 Offset state test method, test device and storage medium

Country Status (1)

Country Link
CN (1) CN113034585B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1012515A (en) * 1996-06-20 1998-01-16 Nikon Corp Projection aligner
CN101666631A (en) * 2009-09-07 2010-03-10 东南大学 Three-dimensional measuring method based on positive and inverse code color encoding stripes
CN108769668A (en) * 2018-05-31 2018-11-06 歌尔股份有限公司 Method for determining position and device of the pixel in VR display screens in camera imaging
CN109521879A (en) * 2018-11-19 2019-03-26 网易(杭州)网络有限公司 Interactive projection control method, device, storage medium and electronic equipment
CN109557669A (en) * 2018-11-26 2019-04-02 歌尔股份有限公司 It wears the image drift method for determination of amount of display equipment and wears display equipment
CN109640066A (en) * 2018-12-12 2019-04-16 深圳先进技术研究院 The generation method and device of high-precision dense depth image
CN110703904A (en) * 2019-08-26 2020-01-17 深圳疆程技术有限公司 Augmented virtual reality projection method and system based on sight tracking
CN111458881A (en) * 2020-05-13 2020-07-28 歌尔科技有限公司 Display system and head-mounted display equipment
CN111476271A (en) * 2020-03-10 2020-07-31 杭州易现先进科技有限公司 Icon identification method, device, system, computer equipment and storage medium
CN111707456A (en) * 2020-08-20 2020-09-25 歌尔光学科技有限公司 Optical machine definition testing method and testing device
CN111738955A (en) * 2020-06-23 2020-10-02 安徽海微电光电科技有限责任公司 Distortion correction method and device for projected image and computer readable storage medium
CN111866481A (en) * 2020-09-22 2020-10-30 歌尔股份有限公司 Method for detecting contamination of projection device, detection device and readable storage medium
CN111896233A (en) * 2020-08-13 2020-11-06 歌尔光学科技有限公司 Contrast test method, contrast test apparatus, and storage medium
CN111896232A (en) * 2020-09-30 2020-11-06 歌尔光学科技有限公司 Optical machine module testing method, equipment, system and computer readable storage medium
CN112040208A (en) * 2020-09-09 2020-12-04 南昌虚拟现实研究院股份有限公司 Image processing method, image processing device, readable storage medium and computer equipment
CN112261394A (en) * 2020-10-20 2021-01-22 歌尔光学科技有限公司 Method, device and system for measuring deflection rate of galvanometer and computer storage medium
CN112326202A (en) * 2020-10-23 2021-02-05 歌尔光学科技有限公司 Binocular parallax testing method, device and tool of virtual reality equipment

Also Published As

Publication number Publication date
CN113034585B (en) 2023-02-28

Similar Documents

Publication Publication Date Title
CN105407342B (en) The calibration method of image calibration system and stereo camera
TWI484283B (en) Image measurement method, image measurement apparatus and image inspection apparatus
CN101655980A (en) Image capture, alignment, and registration
CN109635639B (en) Method, device, equipment and storage medium for detecting position of traffic sign
CN108986721B (en) Detection graph generation method for display panel detection
CN110769229A (en) Method, device and system for detecting color brightness of projection picture
JP2016220198A (en) Information processing device, method, and program
CN112261394B (en) Method, device and system for measuring deflection rate of galvanometer and computer storage medium
JP2018124441A (en) System, information processing apparatus, information processing method, and program
CN111896233A (en) Contrast test method, contrast test apparatus, and storage medium
CN113034585B (en) Offset state test method, test device and storage medium
KR102585556B1 (en) Apparatus for testing camera image distortion and method thereof
JP2020194998A (en) Control arrangement, projection system, control method, program and storage medium
CN113781414A (en) Lens resolving power testing method and device and electronic equipment
CN105427315B (en) Digital instrument image position testing method and device
CN108848358B (en) Method and device for correcting color convergence errors
CN111857623A (en) Calibration apparatus, calibration system, and display apparatus calibration method
CN108200043B (en) Picture verification code verification method and picture verification code verification device
CN116499362A (en) Steel plate size online measurement system
CN109839061B (en) Lens module detection method and system
CN112308933B (en) Method and device for calibrating camera internal reference and computer storage medium
JP2007033040A (en) Method and device for calibrating optical head part in three-dimensional shape measuring instrument by optical cutting method
JP2013190281A (en) Installation state detection system, installation state detection apparatus and installation state detection method
CN113012182B (en) Offset state testing method, testing device and storage medium
CN105486227A (en) Font size test method and device for digital instrument

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221118

Address after: 261031 workshop 1, phase III, Geer Photoelectric Industrial Park, 3999 Huixian Road, Yongchun community, Qingchi street, high tech Zone, Weifang City, Shandong Province

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 261031 No. 268 Dongfang Road, hi tech Industrial Development Zone, Shandong, Weifang

Applicant before: GOERTEK Inc.

GR01 Patent grant