CN117197399A - Method and device for testing asynchronous time warping function - Google Patents

Method and device for testing asynchronous time warping function

Info

Publication number
CN117197399A
CN117197399A (application CN202210609139.0A)
Authority
CN
China
Prior art keywords
augmented reality
asynchronous time
pose
function
distortion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210609139.0A
Other languages
Chinese (zh)
Inventor
盛崇山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202210609139.0A priority Critical patent/CN117197399A/en
Publication of CN117197399A publication Critical patent/CN117197399A/en
Pending legal-status Critical Current


Abstract

The present disclosure provides a method and apparatus for testing an asynchronous time warping function, wherein the method includes: acquiring, during operation of the augmented reality system of an augmented reality device, a warping test result of the asynchronous time warping function of the augmented reality system; and determining performance parameter information of the asynchronous time warping function based on the warping test result.

Description

Method and device for testing asynchronous time warping function
Technical Field
The present disclosure relates to the field of computer technology, and in particular, to a method and apparatus for testing an asynchronous time warping function.
Background
In the field of augmented reality, an augmented reality device worn by a user needs to continuously acquire images through an image acquisition apparatus, render them according to the acquired images and the current pose, and generate pictures for display. To prevent large errors in the rendered picture caused by the pose of the image acquisition apparatus changing too quickly, an asynchronous time warping (Asynchronous Timewarp, ATW) function is generally adopted to predict the pose of the image acquisition apparatus, and rendering is performed according to the acquired image and the predicted pose to generate the picture to be displayed.
Since asynchronous time warping is an underlying algorithm, a user can only perceive its effect indirectly through the frames shown on the display. During long-term research and development, the applicant of the present application found that no method for testing the asynchronous time warping function currently exists, making it difficult to determine whether the function meets usage requirements.
Disclosure of Invention
Embodiments of the present disclosure provide at least a method and an apparatus for testing an asynchronous time warping function.
In a first aspect, an embodiment of the present disclosure provides a method for testing an asynchronous time warping function, including:
acquiring, during operation of the augmented reality system of an augmented reality device, a warping test result of the asynchronous time warping function of the augmented reality system;
and determining performance parameter information of the asynchronous time warping function based on the warping test result.
In the above method, the warping test result of the asynchronous time warping function can be obtained while the augmented reality system is working, and performance parameter information characterizing the asynchronous time warping function is determined based on data such as the predicted pose and the augmented reality image in the warping test result. By adopting the method, the performance of the asynchronous time warping function can be objectively reflected through the performance parameters, making its performance evaluation more accurate.
In one possible implementation, obtaining the warping test result of the asynchronous time warping function of the augmented reality system includes:
acquiring a predicted pose of the augmented reality device, and acquiring an augmented reality image corresponding to the predicted pose.
In one possible implementation, acquiring the predicted pose of the augmented reality device includes:
acquiring translation information and rotation information of the augmented reality device;
and determining the predicted pose of the augmented reality device at at least one preset time based on the translation information, the rotation information, and the current pose information of the device.
By adopting the method, the pose can be predicted across multiple dimensions, so that the pose of the augmented reality device can be predicted accurately.
In a possible implementation manner, determining the performance parameter information of the asynchronous time warping function based on the warping test result includes:
performing performance analysis on the warping test result to determine performance parameter information of the asynchronous time warping function under various performance parameters; the performance parameters include one or more of a rendering frame rate of an augmented reality video composed of multiple frames of the augmented reality image, a pose translation error, and a pose rotation error.
By adopting the method, both the efficiency with which the augmented reality device renders the augmented reality video and the accuracy of its pose prediction can be evaluated.
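The rendering frame rate named above can be estimated directly from frame display timestamps. The following Python sketch is illustrative only — the patent does not specify how the frame rate is computed, and the function name and timestamp representation are assumptions:

```python
def rendering_frame_rate(frame_timestamps):
    """Average rendering frame rate (frames/second) of an augmented reality
    video, estimated from the display timestamps (in seconds) of its frames.
    A simple illustrative estimate, not the patent's specific procedure."""
    if len(frame_timestamps) < 2:
        raise ValueError("need at least two frames to estimate a frame rate")
    elapsed = frame_timestamps[-1] - frame_timestamps[0]
    # N timestamps bound N-1 inter-frame intervals
    return (len(frame_timestamps) - 1) / elapsed
```

For instance, four frames displayed 0.1 s apart yield an estimated rate of 10 frames per second.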
In one possible embodiment, the warping test results comprise at least one set of test results; the augmented reality image comprises at least one virtual object;
each set of test results is obtained by processing the test resources corresponding to the virtual object using the asynchronous time warping function, and different sets of test results differ in at least one of: the corresponding augmented reality device, the format of the test resources, the complexity of the test resources, and the category of the test resources.
By adopting the method, the performance of the asynchronous time warping function of the augmented reality device can be tested across multiple dimensions, so that its performance is reflected more comprehensively.
In one possible implementation, acquiring the augmented reality image corresponding to the predicted pose includes:
detecting the illuminance value and the illumination color value of the real scene where the augmented reality device is located;
and rendering the virtual scene corresponding to the predicted pose based on the illuminance value and the illumination color value to obtain the augmented reality image.
By adopting the method, the lighting in the rendered virtual scene is close to the lighting in the real scene, so that the virtual scene blends better with the real scene and the rendered augmented reality image appears more lifelike.
In a possible implementation, where the performance parameters include a pose translation error and a pose rotation error, determining the performance parameter information of the asynchronous time warping function based on the warping test result includes:
acquiring a ground-truth pose of the augmented reality device, and determining the pose translation error and the pose rotation error of the augmented reality device based on the ground-truth pose and the predicted pose.
By adopting the method, the difference between the pose predicted by the augmented reality device and the ground-truth pose can be calculated accurately, so that the pose-prediction performance of the device can be judged accurately.
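One conventional way to realize the two errors described above is the Euclidean distance between positions and the angle between orientation quaternions. This Python sketch is an assumed illustration — the patent only requires comparing the ground-truth and predicted poses, and the quaternion representation is an assumption:

```python
import math

def pose_errors(true_position, true_quaternion, pred_position, pred_quaternion):
    """Pose translation error as the Euclidean distance between positions;
    pose rotation error as the angle (radians) between two unit quaternions
    given as (w, x, y, z) tuples."""
    translation_error = math.dist(true_position, pred_position)
    # |dot| handles the double cover: q and -q encode the same rotation
    dot = abs(sum(a * b for a, b in zip(true_quaternion, pred_quaternion)))
    rotation_error = 2.0 * math.acos(min(1.0, dot))
    return translation_error, rotation_error
```

A predicted position 5 units away from the ground truth with an identical orientation yields errors of (5.0, 0.0).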
In a possible implementation manner, the determining performance parameter information of the asynchronous time warping function includes:
rendering an augmented reality image corresponding to the predicted pose;
evaluating the augmented reality image rendered by the augmented reality device according to a preset evaluation strategy to obtain a first prediction score; and/or receiving a second prediction score input by a user, wherein the second prediction score is obtained through evaluation based on the preset evaluation strategy;
and taking the first prediction score and/or the second prediction score as the performance parameter information.
By adopting the method, a first prediction score generated from the perspective of data analysis and a second prediction score input from the perspective of user experience can both be obtained, so that the performance of the asynchronous time warping function of the augmented reality device is reflected more comprehensively based on the two scores.
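The "first prediction score and/or second prediction score" logic above can be sketched as follows; the averaging policy and the function name are assumptions for illustration, since the patent leaves open how the two scores are combined:

```python
def performance_score(first_score=None, second_score=None):
    """Combine the automatically computed first prediction score with the
    user-supplied second prediction score; either may be absent ('and/or'
    in the text). Averaging both when present is an assumed policy."""
    scores = [s for s in (first_score, second_score) if s is not None]
    if not scores:
        raise ValueError("at least one prediction score is required")
    return sum(scores) / len(scores)
```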
In a possible implementation, the method for testing the asynchronous time warping function is performed before the asynchronous time warping function is integrated into an application program;
and/or the asynchronous time warping function is intended for integration into an augmented reality application program.
Here, the asynchronous time warping function is decoupled from the other modules of the application program. Compared with testing its performance inside a fully integrated application program, testing the function alone reduces the influence of the other modules on rendering, allows the performance of the asynchronous time warping function to be determined accurately, and reduces the effort required to localize problems found by the test.
In one possible embodiment, the method further comprises:
acquiring evaluation conditions corresponding to the performance parameter information;
and determining a test result of the asynchronous time warping function of the augmented reality device based on the performance parameter information and the evaluation conditions.
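A minimal sketch of checking the performance parameter information against the evaluation conditions; the dict-of-predicates shape and the parameter names are assumptions, since the patent leaves the form of the evaluation conditions open:

```python
def evaluate(parameters, conditions):
    """For each performance parameter that has a corresponding evaluation
    condition (a predicate keyed by parameter name), report whether the
    measured value satisfies it."""
    return {name: conditions[name](value)
            for name, value in parameters.items() if name in conditions}
```

Used, for example, to require a minimum rendering frame rate and a maximum pose translation error.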
In one possible implementation, the operation of the augmented reality system of the augmented reality device includes the augmented reality device executing the asynchronous time warping function while following head movements;
obtaining the warping test result of the asynchronous time warping function of the augmented reality system includes:
obtaining warping test results of the asynchronous time warping function of the augmented reality system under various complexities.
In a second aspect, embodiments of the present disclosure further provide a test apparatus for an asynchronous time warp function, including:
the acquisition module is configured to acquire, during operation of the augmented reality system of the augmented reality device, a warping test result of the asynchronous time warping function of the augmented reality system;
and the determining module is configured to determine performance parameter information of the asynchronous time warping function based on the warping test result.
In one possible implementation, the acquisition module, when acquiring the warping test result of the asynchronous time warping function of the augmented reality system, is configured to:
acquire a predicted pose of the augmented reality device, and acquire an augmented reality image corresponding to the predicted pose.
In one possible implementation, the acquisition module, when acquiring the predicted pose of the augmented reality device, is configured to:
acquire translation information and rotation information of the augmented reality device;
and determine the predicted pose of the augmented reality device at at least one preset time based on the translation information, the rotation information, and the current pose information of the device.
In a possible implementation, the determining module, when determining the performance parameter information of the asynchronous time warping function based on the warping test result, is configured to:
perform performance analysis on the warping test result to determine performance parameter information of the asynchronous time warping function under various performance parameters; the performance parameters include one or more of a rendering frame rate of an augmented reality video composed of multiple frames of the augmented reality image, a pose translation error, and a pose rotation error.
In one possible embodiment, the warping test results comprise at least one set of test results; the augmented reality image comprises at least one virtual object;
each set of test results is obtained by processing the test resources corresponding to the virtual object using the asynchronous time warping function, and different sets of test results differ in at least one of: the corresponding augmented reality device, the format of the test resources, the complexity of the test resources, and the category of the test resources.
In one possible implementation, the acquisition module, when acquiring the augmented reality image corresponding to the predicted pose, is configured to:
detect the illuminance value and the illumination color value of the real scene where the augmented reality device is located;
and render the virtual scene corresponding to the predicted pose based on the illuminance value and the illumination color value to obtain the augmented reality image.
In a possible implementation, in the case that the performance parameters include a pose translation error and a pose rotation error, the determining module, when determining the performance parameter information of the asynchronous time warping function based on the warping test result, is configured to:
acquire a ground-truth pose of the augmented reality device, and determine the pose translation error and the pose rotation error of the augmented reality device based on the ground-truth pose and the predicted pose.
In a possible implementation, the determining module, when determining the performance parameter information of the asynchronous time warp function, is configured to:
rendering an augmented reality image corresponding to the predicted pose;
evaluate the augmented reality image rendered by the augmented reality device according to a preset evaluation strategy to obtain a first prediction score; and/or receive a second prediction score input by a user, wherein the second prediction score is obtained through evaluation based on the preset evaluation strategy;
and take the first prediction score and/or the second prediction score as the performance parameter information.
In a possible implementation, the method for testing the asynchronous time warping function is performed before the asynchronous time warping function is integrated into an application program;
and/or the asynchronous time warping function is intended for integration into an augmented reality application program.
In a possible implementation manner, the determining module is further configured to:
acquire evaluation conditions corresponding to the performance parameter information;
and determine a test result of the asynchronous time warping function of the augmented reality device based on the performance parameter information and the evaluation conditions.
In one possible implementation, the operation of the augmented reality system of the augmented reality device includes the augmented reality device executing the asynchronous time warping function while following head movements;
the acquisition module, when acquiring the warping test result of the asynchronous time warping function of the augmented reality system, is configured to:
obtain warping test results of the asynchronous time warping function of the augmented reality system under various complexities.
In a third aspect, embodiments of the present disclosure further provide a computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory in communication via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of the first aspect, or any of the possible implementations of the first aspect.
In a fourth aspect, the presently disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the first aspect, or any of the possible implementations of the first aspect.
For the effects of the testing apparatus, the computer device, and the computer-readable storage medium for the asynchronous time warping function, refer to the description of the testing method above; the details are not repeated here.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the aspects of the disclosure.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the technical aspects of the disclosure.
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. It is to be understood that the following drawings illustrate only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; a person of ordinary skill in the art may derive other related drawings from them without inventive effort.
FIG. 1 illustrates a flow chart of a method of testing an asynchronous time warp function provided by an embodiment of the present disclosure;
FIG. 2 illustrates a schematic architecture of a test apparatus for asynchronous time warp functionality provided by embodiments of the present disclosure;
fig. 3 shows a schematic structural diagram of a computer device according to an embodiment of the disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the embodiments of the present disclosure, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure provided in the accompanying drawings is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
Augmented reality (Augmented Reality, AR) technology is a technology that fuses virtual information with the real world. The augmented reality device can acquire an initial image through the image acquisition device, render the initial image based on the pose corresponding to the initial image (namely, the pose of the augmented reality device when acquiring the initial image), for example, fuse a virtual model on the initial image, and then display the rendered augmented reality image. The augmented reality device may be, for example, an optical perspective device such as AR glasses, AR helmets, or the like.
Because rendering an initial image takes a certain amount of processing time, when the pose of the augmented reality device changes greatly within a short time (for example, a rotation of 90 degrees), multiple frames of initial images cannot be rendered in time, and the picture presented by the display device of the augmented reality device therefore lags noticeably.
Thus, the asynchronous time warping function of the augmented reality system includes: obtaining an initial image, predicting the predicted pose of the augmented reality device (the predicted pose corresponds to the initial image), rendering an augmented reality image based on the initial image and the predicted pose, and sending the augmented reality image to the display device for display. In this way, the augmented reality image for the next moment can be generated and displayed in advance, reducing delay to a certain extent.
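The steps above can be sketched as a simple control-flow loop. Everything here is an injected stand-in — capture, pose prediction, rendering, and display are hypothetical callables — so this illustrates only the order of operations, not a real implementation:

```python
def atw_frame_loop(capture_image, predict_pose, render, display, num_frames=3):
    """Illustrative asynchronous-time-warp loop: each captured initial image
    is rendered against a *predicted* pose, so the displayed frame matches
    where the device is expected to be at display time rather than where it
    was at capture time."""
    frames = []
    for _ in range(num_frames):
        image = capture_image()        # obtain an initial image
        pose = predict_pose()          # predicted pose corresponding to it
        frame = render(image, pose)    # render AR image from image + pose
        display(frame)                 # send to the display device
        frames.append(frame)
    return frames
```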
Since asynchronous time warping is an underlying algorithm, a user can only perceive its effect indirectly through the frames shown on the display. During long-term research and development, the applicant of the present application found that no method for testing the asynchronous time warping function currently exists, making it difficult to determine whether the function meets usage requirements.
Based on the above study, the present disclosure provides a method and apparatus for testing an asynchronous time warping function, which can obtain a warping test result of the asynchronous time warping function while the augmented reality system is working, and determine performance parameter information characterizing the asynchronous time warping function based on data such as the predicted pose and the augmented reality image in the warping test result. By adopting the method, the performance of the asynchronous time warping function can be objectively reflected through the performance parameters, making its performance evaluation more accurate.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The term "and/or" herein merely describes an association relationship, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, including at least one of A, B, and C may mean including any one or more elements selected from the set consisting of A, B, and C.
To facilitate understanding of the present embodiment, a method for testing an asynchronous time warping function disclosed in the embodiments of the present disclosure is first described in detail. The execution body of the testing method provided by the embodiments of the present disclosure is an electronic device, which may be any device with processing capability, for example a tablet computer, a mobile phone, or a computer. In some possible implementations, the testing method may be implemented by a processor invoking computer-readable instructions stored in a memory.
Referring to fig. 1, a flowchart of a method for testing an asynchronous time warping function according to an embodiment of the present disclosure is shown. The method includes steps 101 and 102, where:
step 101: acquiring, during operation of the augmented reality system of an augmented reality device, a warping test result of the asynchronous time warping function of the augmented reality system;
step 102: determining performance parameter information of the asynchronous time warping function based on the warping test result.
The above steps are described in detail below.
With respect to step 101:
In a possible implementation, the method of testing the asynchronous time warp function is performed before the asynchronous time warp function is integrated in the application; and/or the asynchronous time warping function is for integration in an augmented reality application.
If the method for testing the asynchronous time warping function is executed before the function is integrated into an application program, the augmented reality device may be a testing device; the initial image may be rendered by enabling the asynchronous time warping function of the device's augmented reality system, and the function can then be tested based on the rendering result.
If the asynchronous time warping function is intended for integration into an augmented reality application program, the augmented reality device may generate an augmented reality video based on the warping test result, and the function can be tested based on the augmented reality video.
Here, the asynchronous time warping function is decoupled from the other modules of the application program. Compared with testing its performance inside a fully integrated application program, testing the function alone reduces the influence of the other modules on rendering, allows the performance of the asynchronous time warping function to be determined accurately, and reduces the effort required to localize problems found by the test.
In one possible application scenario, the warping test result may be obtained after a user wearing the augmented reality device performs head movements and position changes in a test area.
In one possible implementation, obtaining the warping test result of the asynchronous time warping function of the augmented reality system may involve acquiring a predicted pose of the augmented reality device and acquiring an augmented reality image corresponding to the predicted pose.
In one possible implementation, when acquiring the predicted pose of the augmented reality device, translation information and rotation information of the device may be acquired first, and the predicted pose of the device at at least one preset time is then determined based on the translation information, the rotation information, and the current pose information of the device.
Here, the translation information represents the pose change of the augmented reality device in the translational degrees of freedom, which comprise movement in the front-back, left-right, and up-down directions (or movement along the X, Y, and Z axes of a world coordinate system); the rotation information represents the pose change of the device in the rotational degrees of freedom, which comprise the azimuth angle, pitch angle, and roll angle (or rotation about the X, Y, and Z axes of the world coordinate system).
The at least one preset time may be the time corresponding to a future predicted video frame. For example, if the current time is the 10th second and the frame rate of the video acquired by the augmented reality device is 10 frames per second, the at least one preset time may be the 10.1th, 10.2th, and 10.3th second.
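The preset times in the example above follow directly from the frame rate. A minimal sketch (the function name and the rounding are assumptions for illustration):

```python
def preset_times(current_time, frame_rate, num_frames):
    """Times (seconds) of the next `num_frames` video frames after
    `current_time`, given the frame rate in frames per second."""
    interval = 1.0 / frame_rate
    # round only to keep the printed values tidy for the example
    return [round(current_time + (i + 1) * interval, 6) for i in range(num_frames)]
```

With the values from the text, `preset_times(10.0, 10, 3)` reproduces the 10.1th, 10.2th, and 10.3th second.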
In particular, in acquiring the translational information and the rotational information, a visual sensor or an inertial measurement unit (Inertial Measurement Unit, IMU) or the like may be employed. The translation information may include acceleration, moving direction, moving distance, etc. of the augmented reality device, and the rotation information may include angular velocity, rotating direction, rotating angle, etc. of the augmented reality device.
The current pose information may include current position information and current angle information, the predicted pose may include predicted position information and predicted angle information, the predicted position information may be determined based on the translation information and the current position information, and the predicted angle information may be determined based on the rotation information and the current angle information.
Here, it should be noted that, from the translation information (acceleration) and rotation information (angular rate) determined at any one time, the predicted poses at a plurality of preset times may be determined, where the preset times may be the times corresponding to the initial images acquired within a preset duration from the current time. For example: based on the translation information and rotation information determined at any time, the predicted poses for the times corresponding to the multiple frames of initial images acquired within the next 1 s may be predicted.
By adopting the method, the pose can be predicted from multiple dimensions, so that the pose of the augmented reality device can be accurately predicted.
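A minimal constant-velocity sketch of the prediction described above, assuming the translation information is reduced to a velocity vector and the rotation information to angular rates; the patent itself does not fix this integration model, so the linear extrapolation here is an assumption:

```python
def predict_pose(position, angles, velocity, angular_velocity, dt):
    """Constant-velocity pose prediction (a simplifying assumption): the
    predicted position integrates the translation information, and the
    predicted angles (azimuth, pitch, roll in degrees) integrate the
    rotation information, over dt seconds."""
    predicted_position = [p + v * dt for p, v in zip(position, velocity)]
    predicted_angles = [a + w * dt for a, w in zip(angles, angular_velocity)]
    return predicted_position, predicted_angles
```

Calling it once per preset time yields the predicted pose for each future frame.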
In one possible implementation, the augmented reality image includes at least one virtual object, where the virtual object may include a dynamic model (e.g., a skeletal animation model) and a static model (e.g., an animal model, a scene model), and the dynamic model may further include an interaction model that may interact with a user, and the interaction model may perform a dynamic action corresponding to a trigger operation of the user in response to the trigger operation. When generating the augmented reality image corresponding to the predicted pose, determining a display pose of the virtual object in the initial image based on the predicted pose, and adding the virtual object to the initial image according to the display pose to generate the augmented reality image.
For example, the virtual object may be a forest scene (including a model of various trees, vegetation and animals), and when generating an augmented reality image corresponding to the predicted pose, the display pose of the forest scene in the initial image may be determined based on the predicted pose, and the forest scene may be superimposed on the initial image to generate an augmented reality image of the forest scene.
In one possible implementation manner, in order to make the display effect of the generated augmented reality image more realistic, the illumination and color of the augmented reality image are generally rendered, so when the augmented reality image corresponding to the predicted pose is acquired, the illumination value and the illumination color value of the real scene where the augmented reality device is located may be detected first, and then the virtual scene corresponding to the predicted pose is rendered based on the illumination value and the illumination color value, so as to obtain the augmented reality image.
Specifically, the illuminance value is used to represent the intensity of the light, and the illumination color value is used to represent the color of the light. For example, when incandescent light is included in a real scene, the illuminance value and the illumination color value of the incandescent light may be detected, and then the presentation effect of the virtual scene may be rendered based on the detected illuminance value and illumination color value.
By adopting the method, the rays in the rendered virtual scene are close to the rays in the real scene, so that the virtual scene is better fused with the real scene, and the rendered augmented reality image is more vivid.
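A minimal sketch of this illumination-matched rendering step (the function name and the linear tint model are assumptions for illustration, not the patent's method): the detected illuminance value scales the intensity, and the illumination color value tints each channel of a virtual object's base color.

```python
def apply_scene_lighting(albedo_rgb, illuminance, light_rgb):
    """Tint a virtual object's base color by the detected real-scene
    illuminance (relative intensity) and illumination color value,
    clamping each channel to the displayable [0, 1] range."""
    return tuple(min(1.0, a * illuminance * c)
                 for a, c in zip(albedo_rgb, light_rgb))

# A warm incandescent light at 80% relative intensity tints a grey object.
lit = apply_scene_lighting((0.5, 0.5, 0.5), 0.8, (1.0, 0.9, 0.7))
```

A production renderer would use a full lighting model, but the principle is the same: the virtual scene's shading parameters are driven by values measured from the real scene.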
In a possible implementation, the distortion test results include at least one set of test results. Each set of test results is obtained by processing, based on the asynchronous time warping function, a test resource corresponding to a virtual object, and different sets differ in at least one of: the augmented reality device corresponding to the test result, the format of the test resource, the complexity of the test resource, and the category of the test resource.
The augmented reality device may include augmented reality devices of different models and different types (such as head-mounted and glasses-type). Different augmented reality devices can process different file formats, and any one augmented reality device may process test resources in at least one format. The complexity of a test resource may be determined based on the model precision, the number of virtual objects, the range of the virtual objects, and the resolution of the virtual objects in the test resource: the larger the number of virtual objects, the higher the model precision, the larger the range of the virtual objects, and the higher the resolution of the virtual objects, the higher the complexity of the test resource. The categories of the test resource may include the dynamic model and the static model.
For example, a certain distortion test result may include three sets of test results: result A, result B, and result C. The complexity of the test resource corresponding to result A is 5 and its category is the dynamic model; the complexity of the test resource corresponding to result B is 5 and its category is the static model; and the complexity of the test resource corresponding to result C is 2 and its category is the static model.
In one possible application scenario, exactly one of the augmented reality device corresponding to a set of test results, the format of the test resources, the complexity of the test resources, and the category of the test resources is varied. For example, to test the pose prediction capability and rendering capability of a given augmented reality device on test resources of different complexity, the format and the category of the set of test resources are kept the same while multiple complexities are used; for example, the set of test resources are all static models in format A, with complexities of 1, 3, and 5 respectively.
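The grouping of test resources along the four parameter dimensions can be sketched with a small data structure (class and field names are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestResource:
    device: str      # target augmented reality device
    fmt: str         # file format the device can process
    complexity: int  # derived from model precision, object count, range, resolution
    category: str    # "dynamic" (e.g. skeletal animation) or "static"

def select(resources, **criteria):
    """Return the resources matching every fixed parameter dimension,
    so that a test can vary exactly one dimension at a time."""
    return [r for r in resources
            if all(getattr(r, k) == v for k, v in criteria.items())]

# Vary only complexity: same device, same format, all static models.
pool = [TestResource("A", "fmtA", c, "static") for c in (1, 3, 5)] + \
       [TestResource("A", "fmtA", 5, "dynamic")]
static_set = select(pool, device="A", fmt="fmtA", category="static")
```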
In one possible application scenario, because the capabilities of different augmented reality devices differ, even if the same resource is rendered with the same asynchronous time warping function, both the rendering process and the rendering result may differ across devices. In order to test the asynchronous time warping function comprehensively, a plurality of augmented reality devices may be selected in advance, and all the test resources rendered on each augmented reality device in turn to obtain a plurality of distortion test results.
In one possible application scenario, in the case of multiple groups of test resources, the test resources may be classified according to parameter dimensions; then, in response to a selection operation for any parameter class within a parameter dimension, at least one group of test resources under the selected parameter class is determined and input to the augmented reality device for testing. The parameter dimensions may include the augmented reality device, the format of the test resource, the complexity of the test resource, and the category of the test resource.
For example, in the case where the multiple groups of test resources are classified according to the augmented reality device, the parameter types of the test resources may be classified into "a device test resources" and "B device test resources", and in response to a trigger operation for the "a device test resources", the preset test resources applied to the a device may be input to the augmented reality device for testing.
By adopting the above method, the performance of the asynchronous time warping function of the augmented reality device can be tested from multiple dimensions, so that this performance is reflected more comprehensively.
For step 102:
The performance parameter information is obtained by rendering the test resources with the asynchronous time warping function of the augmented reality device and can be used to evaluate the performance of that function. The performance parameter information may reflect the performance of the asynchronous time warping function through the presentation effect of the generated augmented reality video and/or through the process of generating the augmented reality video.
In one possible implementation, when determining the performance parameter information of the asynchronous time warp function based on the warp test result, performance analysis may be performed on the warp test result to determine the performance parameter information of the asynchronous time warp function under a plurality of performance parameters, where the performance parameters include one or more of a rendering frame rate, a pose translation error, and a pose rotation error of an augmented reality video composed of a plurality of frames of the augmented reality image.
Specifically, when determining the rendering frame rate of the augmented reality video, a frame-capture tool commonly used on mobile devices, such as an ARM tool or a Qualcomm frame-capture tool, may be used. When performing performance analysis on the rendering frame rate, a standard requirement for the rendering frame rate may be obtained first, such as "the picture refresh rate is constant and not less than 60 Hz" (where the picture refresh rate is the rendering frame rate), and then compared against the measured rendering frame rate; for example, first check whether the rendering frame rate is constant, and then compare a preset standard frame rate with the rendering frame rate.
Continuing the above example: if the rendering frame rate is 50 Hz, it is less than the standard frame rate, indicating that the performance of the asynchronous time warping function of the augmented reality device is poor; if the rendering frame rate fluctuates widely (e.g., exceeds a preset fluctuation standard), this likewise indicates poor performance; and if the rendering frame rate is not less than 60 Hz and is stable (e.g., fluctuates less than the preset fluctuation standard), the performance of the asynchronous time warping function of the augmented reality device is good.
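The frame-rate check just described can be sketched as follows (the fluctuation threshold is an assumed parameter; the 60 Hz standard comes from the requirement quoted above):

```python
def frame_rate_ok(samples_hz, standard_hz=60.0, max_fluctuation_hz=2.0):
    """A rendering frame rate passes if it never drops below the preset
    standard frame rate and its fluctuation (max - min over the sampled
    values) stays within the preset fluctuation standard."""
    fluctuation = max(samples_hz) - min(samples_hz)
    return min(samples_hz) >= standard_hz and fluctuation <= max_fluctuation_hz
```

For example, samples of [60.2, 60.8, 61.0] Hz pass, while [50.0, 61.0] Hz fails both the minimum-rate and the stability checks.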
By adopting the method, the working efficiency of the augmented reality equipment for rendering the augmented reality video and the accuracy of predicting the pose can be evaluated.
In a possible implementation, in case the performance parameters include a pose translation error and a pose rotation error, when determining performance parameter information of the asynchronous time warping function based on the warping test result, a true pose of the augmented reality device may be obtained, and based on the true pose and the predicted pose, the pose translation error and the pose rotation error of the augmented reality device may be determined.
The true-value pose comprises true-value position information and true-value angle information; the pose translation error represents the difference between the true-value position information and the predicted position information of the augmented reality device, and the pose rotation error represents the difference between the true-value angle information and the predicted angle information of the augmented reality device.
When determining the true-value pose of the augmented reality device, it may be determined by a simultaneous localization and mapping (Simultaneous Localization and Mapping, SLAM) technique, for example via at least one preset image acquisition device, or by an optical motion capture system such as a Vicon system; the pose translation error and the pose rotation error may then be calculated respectively.
Here, the true position information, the true angle information, the predicted position information, and the predicted angle information are all data in a world coordinate system established in advance, that is, the position information includes coordinates in an X-axis, a Y-axis, and a Z-axis, and the angle information is an angle with respect to the world coordinate system.
When performance analysis is performed based on the pose translation error, a preset standard translation error (for example, 1 cm) can be obtained first, then the standard translation error is compared with the pose translation error, when the pose translation error is smaller than the standard translation error, the performance of the augmented reality device is better, and when the pose translation error is not smaller than the standard translation error, the performance of the augmented reality device is poorer.
When performance analysis is performed based on the pose rotation error, a preset standard rotation error (for example, 0.5 °) can be obtained first, then the standard rotation error is compared with the pose rotation error, when the pose rotation error is smaller than the standard rotation error, the performance of the augmented reality device is better, and when the pose rotation error is not smaller than the standard rotation error, the performance of the augmented reality device is poorer.
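A minimal sketch of the two error computations and the comparison against the preset standards (using the per-axis maximum for the rotation error is an assumption; the patent only requires comparing true-value and predicted angle information in the shared world coordinate system):

```python
import math

def pose_errors(true_pos, pred_pos, true_ang, pred_ang):
    """Translation error: Euclidean distance between true-value and
    predicted position in the world frame (cm). Rotation error: largest
    per-axis difference between true-value and predicted angles (deg)."""
    translation_error = math.dist(true_pos, pred_pos)
    rotation_error = max(abs(t - p) for t, p in zip(true_ang, pred_ang))
    return translation_error, rotation_error

def within_standard(t_err, r_err, std_t_cm=1.0, std_r_deg=0.5):
    """Compare against the preset standard translation/rotation errors."""
    return t_err < std_t_cm and r_err < std_r_deg
```

For example, a predicted position off by (0.3, 0.4, 0) cm gives a translation error of 0.5 cm, which together with a 0.2° rotation error would pass both checks.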
In summary, by way of example, it may be determined that the performance of the augmented reality device is better when the rendering frame rate is constant and not less than 60Hz, the pose translation error is less than 1cm, and the pose rotation error is less than 0.5 °.
In a possible implementation, the performance parameter information of the asynchronous time warping function may also be determined by comparing the distortion test results of different augmented reality devices. Specifically, the same test resources can be input to augmented reality devices of different models and types, a plurality of distortion test results obtained respectively, and these results compared to compare the performance of each augmented reality device.
In one possible implementation, performance analysis can also be performed through the display effect of the augmented reality video on the display device. Specifically, if the augmented reality device renders the augmented reality video slowly, then as the augmented reality device moves, the display position of the virtual object in the display device may remain unchanged, or the pose of the virtual object may change abruptly (i.e., a jitter phenomenon). Therefore, the performance of the asynchronous time warping function of the augmented reality device can be judged by observing whether the display position of the displayed virtual object is updated when the pose of the augmented reality device changes.
Here, in the case where the augmented reality video is a video rendered by the illuminance value and the illumination color value, the performance analysis may be performed by the same method.
By adopting the method, the difference between the predicted pose predicted by the augmented reality equipment and the true pose can be accurately calculated, so that the performance of the predicted pose of the augmented reality equipment can be accurately judged.
In a possible implementation manner, when determining the performance parameter information of the asynchronous time warping function, an augmented reality image corresponding to the predicted pose may be rendered first; then, evaluating the augmented reality image rendered by the augmented reality equipment according to a preset evaluation strategy to obtain a first prediction score; and/or receiving a second prediction score input by a user, wherein the second prediction score is obtained by evaluation based on the preset evaluation strategy; and finally, taking the first predictive score and/or the second predictive score as the performance parameter information.
Specifically, the preset evaluation strategy may include a strategy for scoring the rendering frame rate, the pose translation error, the pose rotation error, and the display effect of the augmented reality video. By way of example, a score corresponding to the rendering frame rate, and/or a score corresponding to the pose translation error, and/or a score corresponding to the pose rotation error may be set, and the first prediction score determined based on at least one of these scores, for example by a mathematical algorithm such as a weighted sum with different weights, or an average, in the preset evaluation strategy. The preset evaluation strategy may also include a subjective evaluation strategy for the display effect of the augmented reality video; for example, scores corresponding to the degrees of delay, ghosting, and jitter may be set, so that a second prediction score input by a user according to the preset evaluation strategy may be received.
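One possible instance of the preset evaluation strategy (the weights and the 0–100 per-metric scale are assumptions for illustration) computes the first prediction score as a normalized weighted sum of the per-metric scores:

```python
def first_prediction_score(metric_scores, weights):
    """Combine per-metric scores (e.g. for rendering frame rate, pose
    translation error, pose rotation error) into one first prediction
    score by a normalized weighted sum."""
    total = sum(weights.values())
    return sum(metric_scores[name] * w for name, w in weights.items()) / total

# Frame rate weighted twice as heavily as each pose error.
score = first_prediction_score(
    {"frame_rate": 80.0, "translation": 90.0, "rotation": 100.0},
    {"frame_rate": 2.0, "translation": 1.0, "rotation": 1.0})
```

The second prediction score would be collected from the user's subjective ratings rather than computed, but can be combined with the first on the same scale.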
By adopting the above method, a first prediction score generated from the angle of data analysis and a second prediction score input from the angle of user experience can be obtained, so that the performance of the asynchronous time warping function of the augmented reality device is reflected more comprehensively based on the first prediction score and the second prediction score.
In one possible implementation, after the performance parameter information is generated, it may be sent to a display device for display. By way of example, the display device may present: "rendering frame rate: 57 Hz; pose translation error: X: 0.1 cm, Y: 0.2 cm, Z: 0.15 cm; pose rotation error: 0.3°".
In one possible implementation, an evaluation condition corresponding to each piece of performance parameter information may be obtained, and a test result of the asynchronous time warping function of the augmented reality device then determined based on the performance parameter information and the evaluation conditions.
Specifically, when the performance parameter information satisfies the evaluation conditions, it may be determined that the performance of the augmented reality device meets the requirement; when it does not satisfy the evaluation conditions, the performance does not meet the requirement. The evaluation conditions may be that the rendering frame rate is constant and not less than the standard frame rate, the pose translation error is lower than the standard translation error, and the pose rotation error is lower than the standard rotation error; for example, "the picture refresh rate (i.e., the rendering frame rate) is constant and not less than 60 Hz" and "the translation error of the predicted pose (i.e., the pose translation error) is less than 1 cm, and the rotation error (i.e., the pose rotation error) is less than 0.5°".
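The full evaluation condition — stable frame rate not below 60 Hz, translation error under 1 cm, rotation error under 0.5° — can be combined into a single verdict (thresholds taken from the criteria quoted above; the fluctuation tolerance is an assumption):

```python
def meets_requirements(frame_rates_hz, translation_error_cm, rotation_error_deg,
                       standard_hz=60.0, max_fluctuation_hz=2.0,
                       std_translation_cm=1.0, std_rotation_deg=0.5):
    """The asynchronous time warping function passes only if every
    evaluation condition holds simultaneously."""
    frame_rate_ok = (min(frame_rates_hz) >= standard_hz and
                     max(frame_rates_hz) - min(frame_rates_hz) <= max_fluctuation_hz)
    return (frame_rate_ok and
            translation_error_cm < std_translation_cm and
            rotation_error_deg < std_rotation_deg)
```

A single failing condition — for example a frame-rate dip to 57 Hz — makes the overall result fail, matching the conjunction of conditions above.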
In one possible implementation, the augmented reality system operation of the augmented reality device includes the augmented reality device performing the asynchronous time warping function while following head movement; when obtaining the distortion test results of the asynchronous time-warping function of the augmented reality system, the distortion test results of the asynchronous time-warping function of the augmented reality system at a plurality of complexities may be obtained.
Specifically, the head movement may include head swings in all directions caused by the movement of an experimenter wearing the augmented reality device. After the augmented reality system of the augmented reality device is started, testing is performed in sequence with test resources of different complexity, and the augmented reality device renders continuously while following the head movement, for example updating the pose of a virtual character in the display device. The predicted pose in the distortion test result is then compared with the true-value pose according to the method above, whether the rendering frame rate in the distortion test result is stable is judged, and whether the generated augmented reality video is smooth, jittery, stuck, or not updated is observed.
In the following, the overall test flow of the above test method for the asynchronous time warping function is summarized; the optical see-through device below may be understood as the augmented reality device described above.
During testing, the optical see-through device works as follows:
1. For an optical see-through device, the augmented reality system on the mobile device should be able to predict the pose the user will be in when they see the virtual content, before sending the rendered image to the display screen, and warp the rendered image accordingly.
In order to apply a unified standard during testing, the following evaluation criteria may be set:
2. The asynchronous time warping function of the augmented reality system on the optical see-through device should fulfil the following requirements:
a) The picture refresh rate is constant and not less than 60Hz;
b) The translation error of the predicted pose is less than 1cm and the rotation error is less than 0.5 degrees.
The picture refresh rate is the rendering frame rate, the translation error is the pose translation error, and the rotation error is the pose rotation error.
The following are two specific test methods:
3. Test the asynchronous time warping function by wearing the optical see-through device: start the augmented reality system in a laboratory with arbitrary head movement, record the 6DOF predicted trajectory under different complexities of the loaded virtual scene, compare it with the SLAM trajectory, and observe the rendering effect displayed on the screen of the mobile device, so as to judge whether the augmented reality system satisfies the following:
a) Whether the frame rate is stable, whether the rendered image is smooth and delay-free;
b) Whether the rendered image is dithered or not updated during movement.
The predicted trajectory with six degrees of freedom (Six Degrees of Freedom, 6DOF) is composed of the predicted poses at the plurality of preset moments; comparing with SLAM means comparing the predicted pose with the true-value pose; and the frame rate is the rendering frame rate.
4. Asynchronous time warp performance test is as follows:
a) Provide static models in a standard format and skeletal animation models of different complexities for the scene, use the estimated illuminance value and illumination color value as the ambient illuminance and color for rendering, and test whether the rendering frame rate can be stabilized at a fixed frame rate of not less than 60 Hz; the rendering running frame rate may be measured with a frame-capture tool commonly used on mobile devices, such as an ARM tool or a Qualcomm frame-capture tool;
b) The 6DOF predicted trajectory and SLAM trajectory are recorded and then compared numerically.
The skeletal animation model is the dynamic model described above, the fixed frame rate is the standard frame rate, and the running frame rate is the rendering frame rate.
According to the testing method for the asynchronous time warping function, the warping testing result of the asynchronous time warping function can be obtained in the working process of the augmented reality system, and the performance parameter information used for representing the asynchronous time warping function is determined based on the predicted pose, the augmented reality image and other data in the warping testing result. By adopting the method, the performance of the asynchronous time warping function can be objectively reflected through the performance parameters, so that the performance evaluation of the asynchronous time warping function is more accurate.
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
Based on the same inventive concept, the embodiments of the present disclosure further provide a test device for an asynchronous time warp function corresponding to the test method for an asynchronous time warp function, and since the principle of solving the problem of the device in the embodiments of the present disclosure is similar to that of the test method for an asynchronous time warp function in the embodiments of the present disclosure, the implementation of the device may refer to the implementation of the method, and the repetition is omitted.
Referring to fig. 2, a schematic architecture diagram of a test apparatus for asynchronous time warping function according to an embodiment of the disclosure is provided, where the apparatus includes: an acquisition module 201 and a determination module 202; wherein,
an obtaining module 201, configured to obtain a distortion test result of an asynchronous time distortion function of an augmented reality system in an operation process of the augmented reality system of an augmented reality device;
a determining module 202 is configured to determine performance parameter information of the asynchronous time warp function based on the warp test result.
In a possible implementation, the obtaining module 201 is configured, when obtaining a distortion test result of an asynchronous time distortion function of the augmented reality system, to:
and acquiring a predicted pose of the augmented reality device, and acquiring an augmented reality image corresponding to the predicted pose.
In a possible implementation manner, the obtaining module 201 is configured, when obtaining the predicted pose of the augmented reality device, to:
acquiring translation information and rotation information of the augmented reality equipment;
and determining the predicted pose of the augmented reality equipment at least one preset moment based on the translation information, the rotation information and the current pose information of the augmented reality equipment.
In a possible implementation, the determining module 202 is configured to, when determining the performance parameter information of the asynchronous time warp function based on the warp test result:
performing performance analysis on the distortion test result to determine performance parameter information of the asynchronous time distortion function under various performance parameters; the performance parameters include one or more of a rendering frame rate, a pose translation error, and a pose rotation error of an augmented reality video composed of multiple frames of the augmented reality image.
In one possible embodiment, the distortion test results comprise at least one set of test results; the augmented reality image comprises at least one virtual object;
each group of test results is obtained by processing test resources corresponding to the virtual object based on the asynchronous time warping function, and at least one of the augmented reality equipment corresponding to the test results, the formats of the test resources, the complexity of the test resources and the types of the test resources in different groups is different.
In a possible implementation manner, the obtaining module 201 is configured, when obtaining the augmented reality image corresponding to the predicted pose, to:
detecting the illumination value and the illumination color value of a real scene where the augmented reality equipment is located;
and rendering the virtual scene corresponding to the predicted pose based on the illumination value and the illumination color value to obtain the augmented reality image.
In a possible implementation, in case the performance parameters comprise a pose translation error and a pose rotation error, the determining module 202 is configured to, when determining the performance parameter information of the asynchronous time warp function based on the warp test result:
Acquiring a true-value pose of the augmented reality device, and determining the pose translation error and the pose rotation error of the augmented reality device based on the true-value pose and the predicted pose.
In a possible implementation, the determining module 202 is configured, when determining the performance parameter information of the asynchronous time warp function, to:
rendering an augmented reality image corresponding to the predicted pose;
evaluating the augmented reality image rendered by the augmented reality device according to a preset evaluation strategy to obtain a first prediction score; and/or receiving a second prediction score input by a user, wherein the second prediction score is obtained by evaluation based on the preset evaluation strategy;
and taking the first prediction score and/or the second prediction score as the performance parameter information.
In a possible implementation, the method of testing the asynchronous time warp function is performed before the asynchronous time warp function is integrated in the application;
and/or the asynchronous time warping function is for integration in an augmented reality application.
In a possible implementation manner, the determining module 202 is further configured to:
Acquiring evaluation conditions corresponding to the performance parameter information;
and determining a test result of the asynchronous time warping function of the augmented reality device based on the performance parameter information and the evaluation condition.
In one possible implementation, the augmented reality system operation of the augmented reality device includes the augmented reality device performing the asynchronous time warping function while following head movement;
the obtaining module 201 is configured to, when obtaining a distortion test result of an asynchronous time distortion function of the augmented reality system:
and obtaining a distortion test result of an asynchronous time distortion function of the augmented reality system under various complexities.
The process flow of each module in the apparatus and the interaction flow between the modules may be described with reference to the related descriptions in the above method embodiments, which are not described in detail herein.
Based on the same technical concept, the embodiment of the disclosure also provides computer equipment. Referring to fig. 3, a schematic diagram of a computer device 300 according to an embodiment of the disclosure includes a processor 301, a memory 302, and a bus 303. The memory 302 is configured to store execution instructions, including a memory 3021 and an external memory 3022; the memory 3021 is also referred to as an internal memory, and is used for temporarily storing operation data in the processor 301 and data exchanged with the external memory 3022 such as a hard disk, and the processor 301 exchanges data with the external memory 3022 through the memory 3021, and when the computer device 300 operates, the processor 301 and the memory 302 communicate with each other through the bus 303, so that the processor 301 executes the following instructions:
Acquiring a distortion test result of an asynchronous time distortion function of an augmented reality system in the working process of the augmented reality system of augmented reality equipment;
performance parameter information of the asynchronous time warp function is determined based on the warp test result.
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of testing an asynchronous time warp function described in the method embodiments above. Wherein the storage medium may be a volatile or nonvolatile computer readable storage medium.
Embodiments of the present disclosure further provide a computer program product, where the computer program product carries program code, where the program code includes instructions for performing the steps of the method for testing an asynchronous time warp function described in the foregoing method embodiments, and specific reference may be made to the foregoing method embodiments, which are not repeated herein.
Wherein the above-mentioned computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a standalone product, may be stored in a processor-executable non-volatile computer-readable storage medium. Based on such understanding, the technical solution of the present disclosure, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the foregoing examples are merely specific embodiments of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the disclosure is not limited thereto. Although the disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art may, within the technical scope disclosed herein, modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of their technical features. Such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the disclosure and are intended to be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
If the technical solution of the application involves personal information, a product applying this technical solution clearly informs users of the personal-information processing rules and obtains their independent consent before processing the personal information. If the technical solution involves sensitive personal information, a product applying it obtains separate consent before processing that information and also satisfies the requirement of "explicit consent". For example, a clear and conspicuous sign is placed at a personal-information collection device, such as a camera, to indicate that one is entering a personal-information collection range and that personal information will be collected; if an individual voluntarily enters the collection range, this is regarded as consent to collection. Alternatively, on a device that processes personal information, where the personal-information processing rules are communicated by conspicuous signs or notices, personal authorization is obtained by pop-up messages or by asking the individual to upload their personal information. The personal-information processing rules may include information such as the personal-information processor, the purpose of processing, the processing method, and the types of personal information processed.

Claims (14)

1. A method for testing an asynchronous time warp function, comprising:
acquiring a warp test result of an asynchronous time warp function of an augmented reality system during operation of the augmented reality system of an augmented reality device; and
determining performance parameter information of the asynchronous time warp function based on the warp test result.
2. The method according to claim 1, wherein the acquiring the warp test result of the asynchronous time warp function of the augmented reality system comprises:
acquiring a predicted pose of the augmented reality device, and acquiring an augmented reality image corresponding to the predicted pose.
3. The method according to claim 2, wherein the acquiring the predicted pose of the augmented reality device comprises:
acquiring translation information and rotation information of the augmented reality device; and
determining the predicted pose of the augmented reality device at at least one preset moment based on the translation information, the rotation information, and current pose information of the augmented reality device.
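For illustration, the pose prediction of claim 3 might be realized as a constant-velocity extrapolation over the preset moments. The vector-plus-yaw representation and the function name `predict_pose` below are assumptions for this sketch, not part of the claims:

```python
import numpy as np

def predict_pose(position, yaw, linear_velocity, angular_velocity, dt_list):
    """Extrapolate the device pose to each preset future moment.

    position, linear_velocity: 3-vectors; yaw, angular_velocity: scalars
    (a full implementation would use quaternions for rotation).
    """
    predictions = []
    for dt in dt_list:
        pred_position = position + linear_velocity * dt   # from translation information
        pred_yaw = yaw + angular_velocity * dt            # from rotation information
        predictions.append((pred_position, pred_yaw))
    return predictions

# Device moving 1 m/s along x and turning 0.5 rad/s; predict 20 ms ahead
preds = predict_pose(np.zeros(3), 0.0, np.array([1.0, 0.0, 0.0]), 0.5, [0.02])
print(preds[0])  # position ~ [0.02, 0, 0], yaw ~ 0.01
```

A real asynchronous time warp pipeline would feed such predicted poses to the warp step just before display refresh.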
4. The method according to claim 2 or 3, wherein the determining performance parameter information of the asynchronous time warp function based on the warp test result comprises:
performing performance analysis on the warp test result to determine performance parameter information of the asynchronous time warp function under various performance parameters, wherein the performance parameters include one or more of a rendering frame rate of an augmented reality video composed of multiple frames of the augmented reality image, a pose translation error, and a pose rotation error.
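The rendering-frame-rate parameter named in claim 4 could, for instance, be measured from the presentation timestamps of the warped frames. The timestamps-in-seconds interface below is an illustrative assumption:

```python
def rendering_frame_rate(frame_timestamps):
    """Average frame rate of an AR video assembled from warped frames.

    frame_timestamps: presentation times in seconds, ascending order.
    """
    if len(frame_timestamps) < 2:
        raise ValueError("need at least two frames")
    elapsed = frame_timestamps[-1] - frame_timestamps[0]
    return (len(frame_timestamps) - 1) / elapsed

# Frames presented every ~11.1 ms, a common AR/VR display rate
ts = [i / 90.0 for i in range(91)]
print(rendering_frame_rate(ts))  # ~90 FPS
```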
5. The method according to any one of claims 2-4, wherein the warp test results comprise at least one set of test results, and the augmented reality image comprises at least one virtual object;
each set of test results is obtained by processing, based on the asynchronous time warp function, a test resource corresponding to the virtual object, and different sets of test results differ in at least one of: the corresponding augmented reality device, the format of the test resource, the complexity of the test resource, and the type of the test resource.
6. The method according to claim 2 or 3, wherein the acquiring the augmented reality image corresponding to the predicted pose comprises:
detecting an illumination value and an illumination color value of a real scene in which the augmented reality device is located; and
rendering a virtual scene corresponding to the predicted pose based on the illumination value and the illumination color value to obtain the augmented reality image.
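One plausible way to obtain the illumination value and illumination color value of claim 6 is to average a camera frame of the real scene. The RGB-mean approach and the Rec. 709 luma weights below are illustrative assumptions; the patent does not specify the detection method:

```python
import numpy as np

def estimate_illumination(camera_frame):
    """Estimate scene illumination from an RGB camera frame.

    camera_frame: H x W x 3 array with values in [0, 1].
    Returns (illumination value, illumination color value).
    """
    mean_rgb = camera_frame.reshape(-1, 3).mean(axis=0)
    # Scalar brightness via Rec. 709 luma weights; color as normalized mean RGB.
    luminance = float(0.2126 * mean_rgb[0] + 0.7152 * mean_rgb[1] + 0.0722 * mean_rgb[2])
    color = mean_rgb / max(mean_rgb.max(), 1e-6)
    return luminance, color

# A uniform mid-grey frame: brightness 0.5, neutral color [1, 1, 1]
frame = np.full((4, 4, 3), 0.5)
lum, col = estimate_illumination(frame)
print(round(lum, 3), col)
```

The renderer would then use these two values to light the virtual scene consistently with the real one.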
7. The method according to claim 4, wherein, in the case where the performance parameters include the pose translation error and the pose rotation error, the determining the performance parameter information of the asynchronous time warp function based on the warp test result comprises:
acquiring a ground-truth pose of the augmented reality device, and determining the pose translation error and the pose rotation error of the augmented reality device based on the ground-truth pose and the predicted pose.
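Under the common convention of representing orientation as a unit quaternion (an assumption; the claims do not fix a rotation representation), the pose translation error and pose rotation error of claim 7 could be computed as:

```python
import numpy as np

def pose_errors(true_pos, true_quat, pred_pos, pred_quat):
    """Pose translation error (metres) and rotation error (radians).

    Quaternions are unit [w, x, y, z]; the quaternion form is an
    assumption for this sketch.
    """
    translation_error = float(np.linalg.norm(np.asarray(pred_pos) - np.asarray(true_pos)))
    # Angle of the relative rotation between the two orientations.
    dot = abs(float(np.dot(true_quat, pred_quat)))
    rotation_error = 2.0 * float(np.arccos(min(dot, 1.0)))
    return translation_error, rotation_error

t_err, r_err = pose_errors([0, 0, 0], [1, 0, 0, 0],
                           [0.03, 0.04, 0.0], [1, 0, 0, 0])
print(t_err, r_err)  # 5 cm translation error, zero rotation error
```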
8. The method according to any one of claims 2-7, wherein the determining performance parameter information of the asynchronous time warp function comprises:
rendering the augmented reality image corresponding to the predicted pose;
evaluating the augmented reality image rendered by the augmented reality device according to a preset evaluation strategy to obtain a first prediction score, and/or receiving a second prediction score input by a user, the second prediction score being obtained by evaluation based on the preset evaluation strategy; and
taking the first prediction score and/or the second prediction score as the performance parameter information.
9. The method according to any one of claims 1-8, wherein the method of testing the asynchronous time warp function is performed before the asynchronous time warp function is integrated into an application;
and/or the asynchronous time warp function is intended for integration into an augmented reality application.
10. The method according to any one of claims 1-9, further comprising:
acquiring an evaluation condition corresponding to the performance parameter information; and
determining a test result of the asynchronous time warp function of the augmented reality device based on the performance parameter information and the evaluation condition.
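The comparison against evaluation conditions in claim 10 could be a simple threshold check per performance parameter. The parameter names, comparator encoding, and threshold values below are illustrative assumptions:

```python
def evaluate(performance: dict, conditions: dict) -> dict:
    """Judge each performance parameter against its evaluation condition.

    conditions maps a parameter name to (comparator, threshold), where
    comparator is "min" (value must be >= threshold) or "max" (<=).
    """
    verdict = {}
    for name, (comparator, threshold) in conditions.items():
        value = performance[name]
        verdict[name] = value >= threshold if comparator == "min" else value <= threshold
    return verdict

perf = {"rendering_frame_rate": 88.0, "pose_translation_error": 0.02}
conds = {"rendering_frame_rate": ("min", 72.0),
         "pose_translation_error": ("max", 0.05)}
print(evaluate(perf, conds))  # both conditions satisfied
```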
11. The method according to any one of claims 2-7, wherein the operation of the augmented reality system of the augmented reality device comprises the augmented reality device executing the asynchronous time warp function while following head movements;
the acquiring the warp test result of the asynchronous time warp function of the augmented reality system comprises:
acquiring warp test results of the asynchronous time warp function of the augmented reality system under various complexities.
12. A test apparatus for an asynchronous time warp function, comprising:
an acquisition module, configured to acquire a warp test result of an asynchronous time warp function of an augmented reality system during operation of the augmented reality system of an augmented reality device; and
a determination module, configured to determine performance parameter information of the asynchronous time warp function based on the warp test result.
13. A computer device, comprising a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the computer device runs, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the method for testing an asynchronous time warp function according to any one of claims 1 to 11.
14. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the method for testing an asynchronous time warp function according to any one of claims 1 to 11.
CN202210609139.0A 2022-05-31 2022-05-31 Method and device for testing asynchronous time warping function Pending CN117197399A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210609139.0A CN117197399A (en) 2022-05-31 2022-05-31 Method and device for testing asynchronous time warping function


Publications (1)

Publication Number Publication Date
CN117197399A true CN117197399A (en) 2023-12-08

Family

ID=88996578


Country Status (1)

Country Link
CN (1) CN117197399A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination