CN117191343A - Method, device and system for testing illumination treatment - Google Patents


Info

Publication number
CN117191343A
CN117191343A (application CN202210595091.2A)
Authority
CN
China
Prior art keywords
illumination
test
determining
lighting
virtual object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210595091.2A
Other languages
Chinese (zh)
Inventor
盛崇山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202210595091.2A priority Critical patent/CN117191343A/en
Publication of CN117191343A publication Critical patent/CN117191343A/en
Pending legal-status Critical Current

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The disclosure provides a method, an apparatus and a system for testing illumination processing. The method includes: acquiring augmented reality images containing a virtual object under each of multiple illumination test environments, where different illumination test environments have different illumination conditions; determining illumination display information of the virtual object in each illumination test environment based on the augmented reality images; and determining an illumination test result based on the illumination display information of the virtual object in each illumination test environment and the illumination condition corresponding to that environment.

Description

Method, device and system for testing illumination treatment
Technical Field
The present disclosure relates to the field of augmented reality (AR) technology, and in particular to a method, apparatus, and system for testing illumination processing.
Background
Light estimation is used to estimate the lighting conditions in a scene. It is very important in augmented reality applications because it allows the virtual object in the resulting augmented reality image to blend better with the real environment and to match characteristics such as brightness and shadow at its location. To ensure that the illumination processing applied to the virtual object is accurate, the illumination processing may be tested.
Disclosure of Invention
Embodiments of the present disclosure provide at least a method, an apparatus, a system, a computer device, and a computer-readable storage medium for testing illumination processing.
In a first aspect, an embodiment of the present disclosure provides a method for testing illumination processing, including: acquiring augmented reality images containing a virtual object under each of multiple illumination test environments, where different illumination test environments have different illumination conditions; determining illumination display information of the virtual object in each illumination test environment based on the augmented reality images; and determining an illumination test result based on the illumination display information of the virtual object in each illumination test environment and the illumination condition corresponding to that environment.
In this way, corresponding augmented reality images can be acquired under multiple illumination test environments defined by different illumination conditions. The illumination display information of the virtual object in those images can then be compared with the illumination conditions to determine an illumination test result, which reflects whether the rendered illumination display information of the virtual object truly reproduces the lighting effect of each illumination test environment. Illumination processing performance under different illumination conditions can therefore be tested more comprehensively and accurately.
In an alternative embodiment, the lighting conditions comprise at least one of: at least one attribute value under illumination attribute and attribute value change information corresponding to the illumination attribute; the illumination attribute includes at least one of: illumination color, illumination intensity, illumination hue, illumination saturation, and illumination direction.
In this way, illumination estimation can be tested along multiple lighting dimensions separately, and its performance in each dimension can be determined, which helps ensure the realism of the augmented reality images obtained using illumination estimation.
In an alternative embodiment, the illumination test environment includes a display light source in a real scene. The method further includes constructing the multiple illumination test environments as follows: determining multiple pieces of illumination test information; for each piece of illumination test information, generating a corresponding control signal; and sending the control signal to the display light source, so that the display light source emits test light according to the control signal, thereby generating the illumination test environment.
In this way, because the control signal can be set dynamically and continuously, multiple illumination test environments can likewise be constructed dynamically and continuously. Compared with creating each illumination test environment independently, this also makes it possible to test how well, and how quickly, the illumination processing reacts while the illumination test environment changes gradually.
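As a sketch of how such control signals might be generated from illumination test information (the two-field signal format and all names here are assumptions; the patent does not specify a signal encoding):

```python
from dataclasses import dataclass

@dataclass
class ControlSignal:
    # Hypothetical two-field encoding of one piece of illumination test information
    color: str        # illumination color, e.g. "white" or "red"
    illuminance: int  # target illuminance of the display light source, in lux

def build_control_signals(test_info):
    """Generate one control signal per piece of illumination test information."""
    return [ControlSignal(color=c, illuminance=lx) for c, lx in test_info]

# Extending the list of test information dynamically extends the
# sequence of illumination test environments accordingly.
signals = build_control_signals([("white", 100), ("white", 200), ("red", 100)])
```

Because the signal list is plain data, it can be regenerated or appended to at any time, which matches the dynamic, continuous construction described above.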
In an alternative embodiment, the method further comprises: determining the illumination condition based on the illumination test information and the display error of the display light source with respect to that illumination test information.
Therefore, when testing the illumination processing, the illumination condition can be compared with the illumination display information of the virtual object to determine the illumination test result. Compared with comparing the illumination test information directly with the illumination display information, this reduces the influence of the display light source's display error on the test and is more accurate.
In an alternative embodiment, the control signal is configured to indicate at least one of the following: turning on only the white light, and controlling the illuminance of the white light to be 100 lx, 200 lx, 300 lx and 400 lx in turn; turning on only the red light, and controlling the illuminance of the red light to be 100 lx, 200 lx, 300 lx and 400 lx in turn; turning on only the green light, and controlling the illuminance of the green light to be 100 lx, 200 lx, 300 lx and 400 lx in turn; turning on only the blue light, and controlling the illuminance of the blue light to be 100 lx, 200 lx, 300 lx and 400 lx in turn.
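The sixteen single-colour settings above form a colour-by-illuminance grid, which can be enumerated programmatically (the function name is an assumption):

```python
def illumination_schedule(colors=("white", "red", "green", "blue"),
                          levels=(100, 200, 300, 400)):
    # One entry per test environment: a single colour turned on alone
    # at one of the listed illuminance levels (in lux)
    return [(color, lux) for color in colors for lux in levels]

schedule = illumination_schedule()  # 4 colours x 4 levels = 16 environments
```

Each tuple corresponds to one illumination test environment to be driven via a control signal.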
In an alternative embodiment, determining the multiple pieces of illumination test information includes: displaying, on a graphical display interface, adjustment buttons corresponding to the illumination test information; and in response to an adjustment operation on at least one adjustment button, determining standard attribute information corresponding to the illumination attribute based on the result of the adjustment operation.
Therefore, attributes can be adjusted visually through the adjustment buttons displayed on the graphical display interface, which makes the operation more convenient.
In an alternative embodiment, acquiring the augmented reality images containing the virtual object under the multiple illumination test environments includes: for each of the multiple illumination test environments, controlling an AR device to acquire the augmented reality image containing the virtual object in that illumination test environment.
In an alternative embodiment, the illumination display information of the virtual object includes: the display attribute value, under at least one illumination attribute, of each pixel in the light-receiving area corresponding to the virtual object in the augmented reality image.
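One plausible way to aggregate those per-pixel display attribute values is to average them over the light-receiving area. This sketch assumes the area is given as a boolean mask (the patent does not fix a representation):

```python
import numpy as np

def display_attribute_values(image, mask):
    """Mean display value per colour channel over the light-receiving area.

    image: H x W x 3 array of rendered pixel values
    mask:  H x W boolean array marking the virtual object's light-receiving area
    """
    region = image[mask]        # (N, 3) pixels inside the light-receiving area
    return region.mean(axis=0)  # one aggregated value per colour channel
```

The resulting per-channel values can then be compared with the illumination condition of the test environment.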
In an optional implementation manner, the determining an illumination test result based on illumination display information of the virtual object under each illumination test environment and the illumination conditions respectively corresponding to each illumination test environment includes: determining illumination errors corresponding to each illumination test environment based on illumination display information of the virtual object in the illumination test environment and illumination conditions in the illumination test environment; and determining the illumination test result based on illumination errors respectively corresponding to the multiple illumination test environments.
In this way, by determining the illumination error, the otherwise abstract quality of the illumination estimation can be expressed numerically, that is, the accuracy of the illumination estimation is measured with a number. This is more direct and easier to work with, and the test result of the illumination estimation can be obtained more accurately.
In an optional implementation, determining, for each illumination test environment, the illumination error corresponding to that environment based on the illumination display information of the virtual object and the illumination condition in that environment includes: normalizing the illumination display information and the illumination condition, respectively, to obtain normalized illumination display information and a normalized illumination condition; and determining the difference between the normalized illumination display information and the normalized illumination condition, and taking that difference as the illumination error.
Because different illumination attributes have different dimensions, equal numeric changes in them produce different effects. Normalization therefore allows the illumination error to be measured more fairly.
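A minimal sketch of that normalization step, assuming a known value range per illumination attribute (the 0–400 lx range here simply matches the illuminance levels used in the examples above):

```python
def normalize(value, lo=0.0, hi=400.0):
    """Map an attribute value into [0, 1] given its dimension's range."""
    return (value - lo) / (hi - lo)

def illumination_error(displayed, expected, lo=0.0, hi=400.0):
    """Absolute difference between the normalized illumination display
    information and the normalized illumination condition."""
    return abs(normalize(displayed, lo, hi) - normalize(expected, lo, hi))

err = illumination_error(displayed=180.0, expected=200.0)  # 0.45 vs 0.50
```

Normalizing both values into [0, 1] makes errors comparable across attributes with different units.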
In an alternative embodiment, determining the illumination test result based on the illumination errors corresponding to the multiple illumination test environments includes: comparing the illumination error corresponding to each illumination test environment with a preset error threshold; determining that the illumination test result is a pass in response to the illumination errors corresponding to all of the illumination test environments being smaller than the preset error threshold; and determining that the illumination test result is a fail in response to the illumination error corresponding to any illumination test environment being greater than or equal to the preset error threshold.
Therefore, by setting a preset error threshold, the illumination test result can be judged against a concrete criterion once the illumination error is obtained.
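The pass/fail rule above reduces to one comparison per environment; a sketch (the 0.1 threshold value is an assumption, not taken from the patent):

```python
def illumination_test_result(errors, threshold=0.1):
    """Pass only if the illumination error of every test environment is
    below the preset error threshold; any single excess fails the test."""
    return all(error < threshold for error in errors)
```

For example, errors of `[0.02, 0.05, 0.08]` pass, while a single error of `0.15` in any environment fails the whole test.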
In an optional implementation, determining the illumination test result based on the illumination display information of the virtual object in each illumination test environment and the illumination conditions corresponding to each illumination test environment includes: determining the illumination condition change response time of the augmented reality (AR) device based on the illumination display information of the virtual object in each illumination test environment and the illumination conditions corresponding to each illumination test environment; and determining the illumination test result based on the illumination condition change response time.
In an alternative embodiment, determining the illumination test result based on the illumination condition change response time includes: comparing the illumination condition change response time with a preset response time threshold; determining that the illumination test result is a pass in response to the response time being less than the preset response time threshold; and determining that the illumination test result is a fail in response to the response time being greater than or equal to the preset response time threshold.
In this way, the response time of the illumination estimation to changes in the illumination test environment can also be determined, again judged against a concrete criterion.
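A sketch of how that response time might be measured and judged. The settling criterion (within 5% of the target) and the 0.5 s threshold are assumptions; the patent only requires comparison against a preset response-time threshold:

```python
def change_response_time(change_time, frames, target, tol=0.05):
    """Time from the lighting-condition change until the virtual object's
    displayed value first settles within `tol` (relative) of the target."""
    for timestamp, value in frames:
        if timestamp >= change_time and abs(value - target) <= tol * target:
            return timestamp - change_time
    return None  # the display never settled on the new condition

def response_time_passes(response_time, threshold=0.5):
    # Fails both when the device is too slow and when it never settles
    return response_time is not None and response_time < threshold
```

For frames sampled at (0.0 s, 100), (0.1 s, 150) and (0.3 s, 198) after a change at 0.0 s toward a target of 200, the measured response time is 0.3 s, which passes a 0.5 s threshold.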
In a second aspect, an embodiment of the present disclosure further provides an apparatus for testing illumination processing, including: a construction module configured to acquire augmented reality images containing a virtual object under each of multiple illumination test environments, where different illumination test environments have different illumination conditions; a first determining module configured to determine illumination display information of the virtual object in each illumination test environment based on the augmented reality images; and a second determining module configured to determine an illumination test result based on the illumination display information of the virtual object in each illumination test environment and the illumination condition corresponding to each illumination test environment.
In an alternative embodiment, the lighting conditions comprise at least one of: at least one attribute value under illumination attribute and attribute value change information corresponding to the illumination attribute; the illumination attribute includes at least one of: illumination color, illumination intensity, illumination hue, illumination saturation, and illumination direction.
In an alternative embodiment, the lighting test environment comprises a display light source in a real scene; the testing device also comprises a processing module for constructing various illumination testing environments by adopting the following modes: determining various illumination test information; generating a control signal corresponding to each type of illumination test information aiming at the illumination test information in the plurality of types of illumination test information; and sending the control signal to a display light source, so that the display light source sends out test light according to the control signal, and the illumination test environment is generated.
In an alternative embodiment, the processing module is further configured to: determine the illumination condition based on the illumination test information and the display error of the display light source with respect to that illumination test information.
In an alternative embodiment, the control signal is configured to indicate at least one of the following: turning on the white light, and controlling the illuminance of the white light to be 100 lx, 200 lx, 300 lx and 400 lx in turn; turning on the red light, and controlling the illuminance of the red light to be 100 lx, 200 lx, 300 lx and 400 lx in turn; turning on the green light, and controlling the illuminance of the green light to be 100 lx, 200 lx, 300 lx and 400 lx in turn; and turning on the blue light, and controlling the illuminance of the blue light to be 100 lx, 200 lx, 300 lx and 400 lx in turn.
In an alternative embodiment, the processing module, when determining the plurality of illumination test information, is configured to: displaying an adjusting button corresponding to the illumination test information on a graphical display interface; and responding to the adjustment operation of at least one adjustment button, and determining standard attribute information corresponding to the illumination attribute based on the result of the adjustment operation.
In an alternative embodiment, when acquiring the augmented reality images containing the virtual object under the multiple illumination test environments, the construction module is configured to: for each of the multiple illumination test environments, control an AR device to acquire the augmented reality image containing the virtual object in that illumination test environment.
In an alternative embodiment, the illumination display information of the virtual object includes: in the augmented reality image, display attribute values of each pixel point in a light receiving area corresponding to the virtual object under at least one illumination attribute.
In an optional implementation manner, the second determining module is configured to, when determining an illumination test result based on illumination display information of the virtual object under each of the illumination test environments and the illumination conditions respectively corresponding to each of the illumination test environments, determine: determining illumination errors corresponding to each illumination test environment based on illumination display information of the virtual object in the illumination test environment and illumination conditions in the illumination test environment; and determining the illumination test result based on illumination errors respectively corresponding to the multiple illumination test environments.
In an optional implementation manner, when determining, for each lighting test environment, an illumination error corresponding to the lighting test environment based on the illumination display information of the virtual object under the lighting test environment and the illumination condition under the lighting test environment, the second determining module is configured to: respectively normalizing the illumination display information and the illumination condition to obtain normalized illumination display information and normalized illumination condition; and determining a difference value between the normalized illumination display information and the normalized illumination condition, and determining the difference value as the illumination error.
In an optional implementation, the second determining module is configured to, when determining the illumination test result based on the illumination errors respectively corresponding to the multiple illumination test environments: compare the illumination error corresponding to each illumination test environment with a preset error threshold; determine that the illumination test result is a pass in response to the illumination errors corresponding to all of the illumination test environments being smaller than the preset error threshold; and determine that the illumination test result is a fail in response to the illumination error corresponding to any illumination test environment being greater than or equal to the preset error threshold.
In an optional implementation manner, the second determining module is configured to, when determining an illumination test result based on illumination display information of the virtual object under each illumination test environment and the illumination conditions respectively corresponding to each illumination test environment, determine: determining illumination condition change response time of the augmented reality AR equipment based on illumination display information of the virtual object in each illumination test environment and the illumination conditions respectively corresponding to each illumination test environment; and determining the illumination test result based on the illumination condition change response time.
In an alternative embodiment, the second determining module is configured to, when determining the illumination test result based on the illumination condition change response time: compare the illumination condition change response time with a preset response time threshold; determine that the illumination test result is a pass in response to the response time being less than the preset response time threshold; and determine that the illumination test result is a fail in response to the response time being greater than or equal to the preset response time threshold.
In a third aspect, an optional implementation of the present disclosure further provides a test system for illumination processing, including: a test device and an augmented reality (AR) device; the test device is configured to test the illumination processing performance of the AR device using the test method according to the first aspect.
In an optional implementation manner, in the testing process of the lighting treatment performance, the AR device obtains an augmented reality image including a virtual object under each lighting test environment, and sends the augmented reality image to the test device, so that the test device completes the testing of the lighting treatment performance of the AR device based on the augmented reality image sent by the AR device.
In an alternative embodiment, the AR device is configured to acquire an augmented reality image including the virtual object in the following manner: shooting a real scene image and determining position information of the virtual object in a camera coordinate system corresponding to the real scene image under each of a plurality of illumination test environments; performing illumination estimation processing on the real scene image to obtain an illumination estimation result of the illumination test environment; and rendering the virtual object based on the illumination estimation result and the position information to obtain the augmented reality image.
In an alternative embodiment, the illumination estimation result includes: an attribute estimation value corresponding to at least one illumination attribute; the AR device is configured to, when performing rendering processing on the virtual object based on the illumination estimation result and the position information to obtain the augmented reality image: determining a light receiving area of the virtual object affected by the illumination of the display light source based on the position information; and rendering the light receiving area of the virtual object based on the attribute estimation value corresponding to at least one illumination attribute to obtain the augmented reality image.
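The final rendering step can be illustrated with a simple multiplicative shading sketch. The patent does not specify a shading model, so the formula, names, and 0–255 value range here are all assumptions:

```python
def shade_light_receiving_area(base_color, light_color, intensity):
    """Modulate the virtual object's base colour in its light-receiving
    area by the estimated light colour and a [0, 1] intensity factor."""
    return tuple(
        min(255, round(base * light / 255 * intensity))
        for base, light in zip(base_color, light_color)
    )

# A white object lit by an estimated pure-red light at full intensity renders red
shaded = shade_light_receiving_area((255, 255, 255), (255, 0, 0), 1.0)
```

The attribute estimation values from the illumination estimation would supply `light_color` and `intensity`, while the position information determines which pixels belong to the light-receiving area.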
In a fourth aspect, an optional implementation of the disclosure further provides a computer device including a processor and a memory, where the memory stores machine-readable instructions executable by the processor, and the processor is configured to execute the machine-readable instructions stored in the memory; when executed by the processor, the machine-readable instructions perform the steps of the first aspect or any possible implementation of the first aspect.
In a fifth aspect, an optional implementation of the present disclosure further provides a computer-readable storage medium having stored thereon a computer program which, when executed, performs the steps of the first aspect or any possible implementation of the first aspect.
For descriptions of the effects of the above apparatus, system, computer device, and computer-readable storage medium for testing illumination processing, refer to the description of the above method for testing illumination processing; they are not repeated here.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. The drawings are incorporated in and constitute a part of the specification; they show embodiments consistent with the present disclosure and, together with the description, serve to illustrate the technical solutions of the present disclosure. It is to be understood that the following drawings illustrate only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope, as a person of ordinary skill in the art may obtain other related drawings from them without inventive effort.
FIG. 1 illustrates a flow chart of a method of testing light treatment provided by an embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of a graphical display interface provided by an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of a test system provided by an embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of a real scene image provided by an embodiment of the present disclosure;
FIG. 5 illustrates a schematic diagram of an augmented reality image provided by an embodiment of the present disclosure;
FIG. 6 shows a schematic diagram of an apparatus for testing illumination processing provided by an embodiment of the present disclosure;
fig. 7 shows a schematic diagram of a computer device provided by an embodiment of the disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the disclosed embodiments generally described and illustrated herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
Research shows that, when acquiring an augmented reality image, the lighting conditions in the real scene can be obtained through illumination estimation in order to improve the realism of the virtual object, so that the virtual object can be rendered with matching illumination color, brightness, shadow and other characteristics. To ensure the accuracy of this illumination processing, the illumination processing may be tested.
Based on the above, the disclosure provides a test method for illumination processing. Specifically, in multiple illumination test environments corresponding to different illumination conditions, the illumination display information of the virtual object in the acquired augmented reality image is compared with the illumination conditions to determine an illumination test result. The result reflects whether the rendered illumination display information of the virtual object truly reproduces the lighting effect of the illumination test environment, so that illumination processing performance under different illumination conditions can be tested more comprehensively and accurately.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
To aid understanding of the present embodiment, a method for testing illumination processing disclosed in an embodiment of the present disclosure is first described in detail. The execution subject of the method is generally a computer device having a certain computing capability, for example a terminal device, a server, or another processing device. The terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular telephone, a cordless telephone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like. In some possible implementations, the method may be implemented by a processor invoking computer-readable instructions stored in a memory.
The test method for illumination processing provided by the embodiments of the present disclosure can be applied in various scenarios: for example, while debugging a particular illumination processing algorithm or model; or, after an illumination estimation algorithm has been deployed in an AR device, to check whether the device's performance meets the requirements of the relevant standard when it performs illumination estimation based on that algorithm. The test method can be used to determine the error between the illumination display information of the virtual object in the illumination test environment and the illumination conditions during the current debugging round, so that the algorithm or model can be improved based on the obtained error, or to check whether the algorithm meets the relevant standard requirements.
Referring to fig. 1, a flowchart of a testing method of illumination processing according to an embodiment of the disclosure is shown, where the method includes steps S101 to S103, where:
s101: respectively acquiring augmented reality images containing virtual objects under various illumination test environments; wherein different lighting test environments have different lighting conditions;
s102: determining illumination display information of the virtual object in each illumination test environment based on the augmented reality image;
s103: and determining an illumination test result based on illumination display information of the virtual object in each illumination test environment and the illumination conditions respectively corresponding to each illumination test environment.
The embodiment of the disclosure uses multiple augmented reality images containing virtual objects, acquired under illumination test environments with different illumination conditions, to determine the illumination display information rendered onto the virtual objects in each environment, and compares that information with the illumination conditions of each environment to obtain an illumination test result for evaluating the illumination processing. Because different illumination conditions are covered, the performance of the illumination processing under varied conditions can be tested more comprehensively and accurately.
Steps S101 to S103 are described in detail below.
For S101, in order to test the illumination processing accurately, the embodiment of the present disclosure constructs multiple illumination test environments, where different illumination test environments have different illumination conditions. The illumination conditions here are the illumination information that actually affects the lighting effect in the real scene.
Specifically, the illumination conditions may comprise at least one of: at least one attribute value under an illumination attribute, and attribute value change information corresponding to the illumination attribute. The illumination attribute includes at least one of: illumination color, illumination intensity, illumination hue, illumination saturation, and illumination direction.
The illumination attributes are described first. The illumination color is the color of the light in the illumination test environment, such as white, red, yellow, blue, or green. The illumination intensity is the luminous flux per unit area: a larger luminous flux per unit area corresponds to a higher illumination intensity, and a smaller one to a lower intensity. The illumination hue may be related to the illumination color and/or the illumination intensity, for example a cold or warm tone within a given illumination color. The illumination saturation is related to the illumination intensity: a higher illumination intensity yields higher saturation, and a lower intensity yields lower saturation. The illumination direction can be determined as the direction of the light toward the virtual object in the illumination test environment.
When testing the illumination processing, a change process can be tested dynamically and continuously. For example, where the illumination condition includes the illumination direction, the direction can be varied across multiple tests, such as deflecting it gradually by 5 degrees in successive tests, to verify whether the virtual object, affected by illumination from different directions, correspondingly shows different rendering effects.
Accordingly, each illumination attribute has corresponding attribute value change information, which may be determined according to the unit of the attribute and its possible range of variation in a real scene. Take the illumination color as an example: if the color is represented in the RGB (Red-Green-Blue) color system, the attribute value change information includes the degree of change of each hue channel across the different illumination test environments. For example, the red channel, whose values range from 0 to 255, may change by 10 per step, taking the values 0, 10, 20, 30, and so on in a sequence of consecutive illumination test environments. The step size may also be varied, for example 0, 5, 10, 20 across the environments.
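As a toy illustration of such attribute value change information, the per-environment sweep of the red channel described above can be sketched as follows. This is a hypothetical helper assuming 8-bit channel values and a configurable step size; the names are not from any real test framework.

```python
# Hypothetical sketch: generating attribute-value change information for the
# red hue channel across consecutive illumination test environments.
def red_channel_sweep(start=0, stop=255, step=10):
    """Yield red-channel values, one per consecutive test environment."""
    value = start
    while value <= stop:
        yield value
        value += step

# e.g. four consecutive environments with red = 0, 10, 20, 30
values = list(red_channel_sweep(stop=30))
```

Changing `step` corresponds to modifying the degree of change mentioned above (e.g. 0, 5, 10, ...).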
In this way, the illumination estimation can be tested separately along multiple dimensions of illumination, and its performance in each dimension determined, helping ensure the realism of the image obtained with the illumination estimation.
Specifically, the illumination test environment includes a display light source in the real scene. The display light source here is the light source that determines the rendering effect on the virtual object during the illumination processing test. It is an actual light emitter in the real scene, which may be a point light source or a surface light source, for example a lamp or an electronic display screen.
When constructing the multiple illumination test environments, multiple kinds of illumination test information can be determined first. The illumination test information is different from the illumination conditions described above: it is artificially set information, i.e., the ideal, standard setting for the test. The illumination test information drives the display light source to emit test light, thereby generating an illumination test environment. In practice, however, the emitted test light is not fully consistent with the illumination test information, because the display light source introduces a display error. For example, if the red channel value is set to 100 in the illumination test information, the value actually realized in the test light may be 95 or 102.
Therefore, the illumination condition can be determined based on the illumination test information and the display error of the display light source on that information. Because different display light sources have different display errors, the corresponding error is determined and compensated according to the display light source actually used, ensuring the accuracy of the obtained illumination condition. When testing the illumination processing, the illumination condition, rather than the raw illumination test information, is then compared with the illumination display information of the virtual object to determine the illumination test result; this reduces the influence of the display error on the test and is more accurate.
As described above, the illumination test information is artificially set; exemplary illumination test information may include, for example, an illumination color of white and an illumination intensity of 100 lux (lx). To construct multiple illumination test environments from the illumination test information, after determining the multiple kinds of illumination test information, a control signal corresponding to each kind can be generated and sent to the display light source, so that the display light source emits test light according to the control signal, generating the illumination test environment.
In a specific implementation, since the illumination test information can cover multiple illumination attributes, each with its own attribute value change information, the embodiment of the present disclosure provides a way to determine the illumination test information through adjustment buttons: adjustment buttons corresponding to the illumination test information are displayed on a graphical display interface, and in response to an adjustment operation on at least one button, the standard attribute information corresponding to the illumination attribute is determined from the result of the operation.
For example, referring to fig. 2, a schematic diagram of a graphical display interface according to an embodiment of the disclosure is provided. The interface contains multiple illumination attributes and their corresponding adjustment buttons, shown in the figure as sliders. In the illumination-attribute area, a currently selected subset of attributes, such as illumination color and illumination intensity, can be determined in response to a selection operation. After adjustment via the buttons, the value corresponding to the current attribute is displayed; the slider form also allows the attribute value to be changed continuously. In the figure, the currently determined standard attribute information is an illumination color of white and an illumination intensity of 100 lux (lx).
In this way, multiple illumination test environments can be constructed dynamically and continuously through the control signals. Compared with creating each illumination test environment independently, this also makes it possible to test the responsiveness and behavior of the illumination processing during fine, gradual changes of the test environment.
In another embodiment of the present disclosure, a specific test case for testing the illumination processing is provided. In this test case, various control signals are determined so that the display light source emits different test lights, completing the test of the illumination processing. The control signal is used for indicating at least one of:
turning on only white light, and controlling its illuminance to be 100 lx, 200 lx, 300 lx, and 400 lx in turn;
turning on only red light, and controlling its illuminance to be 100 lx, 200 lx, 300 lx, and 400 lx in turn;
turning on only green light, and controlling its illuminance to be 100 lx, 200 lx, 300 lx, and 400 lx in turn;
turning on only blue light, and controlling its illuminance to be 100 lx, 200 lx, 300 lx, and 400 lx in turn.
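The four groups of control signals above can be enumerated as a simple color-by-illuminance grid. The sketch below is a hedged illustration in which the control-signal fields (`color`, `illuminance_lx`) are assumed names, not part of any real device protocol.

```python
from itertools import product

# The colors and illuminance levels of the basic test case described above.
COLORS = ["white", "red", "green", "blue"]
ILLUMINANCES_LX = [100, 200, 300, 400]

def build_control_signals():
    """Return one control-signal dict per (color, illuminance) combination."""
    return [
        {"color": color, "illuminance_lx": lx}
        for color, lx in product(COLORS, ILLUMINANCES_LX)
    ]

signals = build_control_signals()  # 4 colors x 4 levels = 16 control signals
```

Each dict would be serialized and sent to the display light source, which then emits the corresponding test light.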
The test case above can serve as the basic test for the illumination processing. As expressed by the control signals, it does not cover low-illumination conditions, such as an illuminance of 20 lx, but corresponding illumination test environments can be constructed as a supplement. In addition, because illumination estimation is more demanding under low illumination, some test scenarios may use dedicated illumination estimation modes; the corresponding illumination test environments can likewise be constructed and tested with the method provided in the embodiments of the present disclosure.
After the multiple illumination test environments are constructed, augmented reality images containing virtual objects can be acquired in each of them. The virtual object may be a simple geometric body, such as a sphere, a cuboid, or a cone, or may be a virtual character, etc.
In a specific implementation, the AR device renders and displays the virtual object in the acquired real scene image to obtain an augmented reality image containing the virtual object. The virtual object is rendered taking the illumination in the real scene into account: the influence of the illumination conditions in the illumination test environment is determined through the illumination processing, and the resulting lighting effect is displayed. The details are described in the corresponding sections below and are not repeated here.
For S102, to test the illumination processing, the illumination display information of the virtual object in each illumination test environment can be determined from the augmented reality images obtained in S101. If the illumination processing performs well, color constancy is maintained within a certain error range, i.e., the illumination display information of the virtual object in the augmented reality image matches the illumination conditions of the test environment; otherwise, the illumination processing can be considered not yet to meet the processing requirements.
The illumination display information of the virtual object includes: the display attribute values, under at least one illumination attribute, of each pixel in the light-receiving area corresponding to the virtual object in the augmented reality image.
In one possible case, the illumination test environment includes a display light source that is a point light source or a surface light source. To simulate the lighting effect in a real scene, the light source illuminates a particular side of the virtual object. For example, if the display light source is a surface light source and the virtual object is a cube with one face illuminated by it, then during the illumination test it is most reasonable to examine the illumination display information on the illuminated face. The illumination display information is the display information, expressed by pixel values, gray values, and the like, produced in the light-receiving area by the test light of the illumination test environment.
How the light-receiving area of the virtual object is determined, and how the illumination display information in that area is determined through the illumination processing, are described in detail below and not repeated here.
For S103, after determining the illumination display information of the virtual object in each illumination test environment and the illumination conditions corresponding to each environment, the illumination test result can be determined.
In a specific implementation, to determine whether the illumination processing performs well across different illumination test environments, an illumination error for each environment can be determined based on the illumination display information and the illumination conditions in that environment, and the illumination test result then determined from the illumination errors of all the environments.
In one possible case, to facilitate measuring the illumination error, the illumination conditions and the illumination display information can be normalized: the illumination conditions are normalized to the range 0-1 by the computer device, and the illumination display information is similarly normalized to 0-1 by the AR device.
Specifically, after the illumination display information and the illumination condition are each normalized, the difference between the normalized illumination display information and the normalized illumination condition can be determined and taken as the illumination error. Since both operands are normalized, the illumination error also lies in the range 0-1.
Since the illumination display information and the illumination condition may each cover multiple illumination attributes, an illumination error can be determined separately for each attribute. Alternatively, because the attributes influence one another, the illumination error may be determined directly from pixel values, the gray values of a corresponding gray map, and so on. The choice can be made according to the actual situation and is not limited here.
In one possible case, the illumination test result can be determined from the illumination errors as follows: compare the illumination error of each illumination test environment with a preset error threshold; determine that the illumination test result is a pass if the errors of all the environments are smaller than the threshold; determine that it is a fail if the error of any environment is greater than or equal to the threshold.
The preset error threshold can be chosen according to the actual situation. For example, if high accuracy of the illumination processing is required, i.e., the illumination display information on the virtual object must closely match the illumination condition, the normalized difference must be correspondingly small, and the threshold may be set to 0.1. More commonly, the threshold can be set to 0.3: if the illumination errors determined in all the illumination test environments are below 0.3, the illumination processing can be considered to meet the accuracy requirement; equivalently, with errors below 0.3, the human eye does not perceive a significant difference between the displayed illumination of the virtual object and the actual illumination condition when viewing the augmented reality image. Accordingly, when the errors of all the illumination test environments are below the preset error threshold, the illumination test result is a pass.
In another possible case, to ensure that the illumination display information changes promptly as the illumination environment changes, the response time of the AR device to an illumination condition change can be determined, and the illumination test result determined from that response time.
Specifically, the change of the illumination display information can be observed across two consecutive illumination test environments. The change of the illumination display information lags the change of the test environment: it changes only after the environment has changed. For example, when a first illumination test environment changes into a second, the time elapsed for the illumination display information of the first environment to become that of the second can be taken as the AR device's illumination-condition-change response time.
In a specific implementation, when determining the illumination test result from the response time, the illumination-condition-change response time is compared with a preset response time threshold: if the response time is less than the threshold, the illumination test result is a pass; if it is greater than or equal to the threshold, the result is a fail.
The preset response time threshold can likewise be chosen according to the actual situation. For example, if a fast response of the illumination processing to changing illumination conditions is required, the threshold may be set to 0.1 seconds. More commonly, the threshold can be set to 1 second: if each illumination condition change is followed within 1 second, the response time of the illumination processing can be considered to meet the requirement; equivalently, below 1 second, the human eye does not perceive a noticeable delay in the augmented reality image after the condition changes. Accordingly, when the response time is below the preset threshold, the illumination test result is a pass; otherwise, when it is greater than or equal to the threshold, the result is a fail.
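The response-time check can be sketched in the same spirit, with the 1-second default from the text; the timestamps and function name are illustrative assumptions.

```python
# Hypothetical sketch: elapsed time between an illumination-condition change
# and the corresponding change in the rendered display information, compared
# against a preset response-time threshold (1 s by default, per the text).
def response_time_passes(change_time_s, display_update_time_s, threshold_s=1.0):
    """Pass if the display followed the condition change within the threshold."""
    response_time = display_update_time_s - change_time_s
    return response_time < threshold_s

# Condition changed at t = 10.0 s; rendered output followed at t = 10.4 s.
ok = response_time_passes(10.0, 10.4)
```

In practice the two timestamps would come from the control-signal log and from detecting the change in the augmented reality image stream.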
In another embodiment of the present disclosure, a test system for illumination processing is further provided. Referring to fig. 3, a schematic diagram of the test system provided in the embodiment of the present disclosure, the system includes a test device 31 and an augmented reality (AR) device 32. The test device 31 is configured to test the illumination processing performance of the AR device 32 using the test method provided by the present disclosure; for details, refer to the description of the test method above, which is not repeated here.
In a specific implementation, during the test of the illumination processing performance, the AR device 32 obtains an augmented reality image containing a virtual object in each illumination test environment and sends it to the test device 31, which completes the test of the AR device's illumination processing performance based on the received images.
The test device 31 is, for example, the computer device in the test method described above. For the process by which the test device 31 uses the augmented reality images to test the illumination processing performance of the AR device 32, refer to the description of the test method above.
As for how the AR device 32 acquires the augmented reality image containing the virtual object during the test: in each of the multiple illumination test environments, the AR device 32 captures a real scene image and determines the position information of the virtual object in the camera coordinate system corresponding to that image; performs illumination estimation processing on the real scene image to obtain an illumination estimation result for the illumination test environment; and renders the virtual object based on the illumination estimation result and the position information to obtain the augmented reality image.
For example, referring to fig. 4, a schematic diagram of a real scene image provided in an embodiment of the disclosure. The real scene contains a display screen serving as the display light source, one side of which emits the test light. In the real scene image, the position where the virtual object will be rendered in the augmented reality image is marked with an "X". By photographing the real scene, the position information of this position in the camera coordinate system can be identified.
From the real scene image it can be determined that an object placed at the "X" position in the real scene would be affected by the illumination of the display screen's test light. To make the rendered augmented reality image more realistic, the AR device 32 performs illumination estimation on the acquired real scene image to obtain an illumination estimation result from the illumination information expressed by the image, and then renders the virtual object accordingly to obtain the augmented reality image.
In the illumination estimation on the real scene image, the direction of the main light source may be estimated; in the example above, the display light source is a surface-light-source display screen, and the direction in which it emits the test light can be determined. The color and brightness of the main light source can also be estimated, i.e., estimated values of the test light under the various illumination attributes described above. In addition, ambient global illumination information can be estimated: in a real scene, other objects or natural light also contribute, forming an ambient global illumination that the illumination estimation can identify.
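The three estimated quantities can be gathered into a small result structure. This is a hypothetical sketch of such a result; the field names are assumptions and do not come from any real illumination-estimation SDK.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical container for an illumination estimation result holding the
# quantities described above: main-light direction, main-light color and
# brightness, and ambient (global) illumination.
@dataclass
class IlluminationEstimate:
    main_light_direction: Tuple[float, float, float]  # unit vector, camera space
    main_light_color: Tuple[float, float, float]      # normalized RGB, 0-1
    main_light_intensity: float                       # normalized brightness, 0-1
    ambient_rgb: Tuple[float, float, float] = (0.0, 0.0, 0.0)

# e.g. a bright white surface light shining straight down:
estimate = IlluminationEstimate(
    main_light_direction=(0.0, -1.0, 0.0),
    main_light_color=(1.0, 1.0, 1.0),
    main_light_intensity=0.9,
)
```

The renderer would consume such a structure when shading the virtual object's light-receiving area.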
Specifically, the embodiments of the present disclosure provide the following several illumination estimation processing methods:
a. gardner model (Gardner's)
In this model, the input is an RGB image with a limited field of view (FOV) in low dynamic range (LDR), and the output is a high-dynamic-range (HDR) panoramic illumination intensity. The model is trained in two stages: the first stage is trained with LDR data, and the second stage is fine-tuned with HDR data, since HDR data is difficult to acquire. The LDR data only predicts light source locations and RGB maps, while the HDR data is used to predict illumination intensity.
b. Dominant Light source algorithm (Dominant Light)
In this algorithm, the input is RGB-D data of the real scene image, and the position information of the light source relative to the AR device in the world coordinate system can be obtained directly, yielding an illumination estimation result for that light source. However, the algorithm cannot effectively identify the relative positions of multiple light sources, so it is better suited to illumination test environments or application scenarios with only a single light source.
c. Deep light algorithm (DeepLight)
In this algorithm, besides the test system, an image acquisition device needs to be built. Three spheres of different materials, a mirror ball, a matte ball, and a diffuse-reflection ball, are photographed by the image acquisition device, and the illumination algorithm is obtained through supervised training on the captured results.
In addition, the following algorithms or models are provided: a comprehensive illumination estimation algorithm (Multi-illumination), a deep parameter estimation algorithm (Deep parameter), a spatially-varying algorithm (Spatially-varying), a neural network illumination estimation algorithm (Neural illumination), and an illumination network model (IlluminaNet).
Using any of the illumination estimation processing methods described, an attribute estimation value corresponding to at least one illumination attribute can be obtained as the illumination estimation result.
Based on the position information determined for the virtual object in the real scene image shown in fig. 4, and using the obtained illumination estimation result, the virtual object can be rendered to obtain an augmented reality image for testing. For example, referring to fig. 5, a schematic diagram of an augmented reality image according to an embodiment of the disclosure. The virtual object is a cuboid displayed at the original "X" position. From the relative positions of the display screen and the virtual object, the front of the display screen faces the face of the virtual object marked "A", so the test light affects the A face; the gray shading in the figure indicates that this face is affected by the illumination and shows a rendering effect different from the other faces.
For example, the rendering effect of the A face may include: a red tint when affected by test light whose illumination color is red; a highlight when affected by test light of high illumination intensity (for example, a normalized value of 0.9); or a dark appearance under test light of low illumination intensity (for example, a normalized value of 0.1); and so on.
In a specific implementation, the light-receiving area of the virtual object affected by the illumination of the display light source can be determined based on the position information, and then rendered based on the attribute estimation value corresponding to at least one illumination attribute to obtain the augmented reality image.
For the augmented reality image shown in fig. 5, the light-receiving area determined for the virtual object based on the position information is the area of the A face. Using the illumination estimation result, the attribute estimation value corresponding to at least one illumination attribute can be determined, and the light-receiving area of the virtual object rendered accordingly to obtain the augmented reality image shown in fig. 5.
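As a deliberately simplified stand-in for the renderer, shading the light-receiving face could look like the following sketch, where the face's base color is multiplied per channel by the estimated light color and intensity; all values are normalized to 0-1 and the function name is illustrative.

```python
# Hypothetical, simplified shading of the light-receiving region (the A face):
# per-channel multiply of base color, estimated light color, and intensity.
def shade_region(base_rgb, light_rgb, intensity):
    """Return the lit color of the region, all components in 0-1."""
    return tuple(b * l * intensity for b, l in zip(base_rgb, light_rgb))

# Grey cuboid face lit by bright red test light: only the red channel survives.
lit_face = shade_region((0.8, 0.8, 0.8), (1.0, 0.0, 0.0), 0.9)
```

A real renderer would of course also account for surface normals, the light direction, and ambient illumination; this sketch only illustrates how the attribute estimation values feed into the displayed effect.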
In this way, the illumination processing performance of the AR device can be tested using the augmented reality images it produces.
In another embodiment of the present disclosure, a specific example of testing the illumination processing performance of an AR device with the test system provided herein is given. In a real scene, a display screen connected to a computer device serves as the display light source, the computer device serves as the test device, and a mobile phone serves as the AR device. The computer device also has a graphical display interface that can present the adjustment buttons corresponding to the illumination test information described in the embodiments above.
In this embodiment, the adjustment buttons of the graphical display interface may first be used to turn on only white light and control its illuminance to 100lx. After the adjustment is confirmed, the computer device generates a corresponding control signal and sends it to the display light source, so that the display light source emits the corresponding test light; at this point, a first illumination test environment has been constructed. Suppose the color value of the illumination test information indicated by the control signal is, for example, 0.3 after normalization by the computer device, and the display error of the display light source introduces a deviation of 0.05, changing the color value to 0.25. That is, the currently actually generated lighting condition includes a normalized color value of 0.25.
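The relation between the commanded test information and the actually generated lighting condition can be sketched as follows (the function name and the sign convention for the display error are assumptions for illustration):

```python
def actual_lighting_condition(commanded_norm, display_error):
    """Actual condition = normalized value in the control signal plus the
    display error introduced by the screen used as the display light source."""
    return round(commanded_norm + display_error, 6)

commanded = 0.3      # normalized color value indicated by the control signal
deviation = -0.05    # display error of the display light source
actual = actual_lighting_condition(commanded, deviation)
# As in the walkthrough, the actually generated normalized color value is 0.25.
```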
By using the mobile phone to shoot the real scene containing the display light source, the illumination estimation result of the current illumination test environment can be determined from the acquired real scene image through illumination estimation processing, and an augmented reality image containing the virtual object can be generated. From the augmented reality image, the illumination display information of the light-receiving region of the virtual object can be determined; the color value expressed in the illumination display information, after normalization by the mobile phone, is for example 0.35.
Here, since the current illumination condition includes a color value of 0.25 and the illumination display information in the augmented reality image includes a color value of 0.35, the illumination error between the two, namely 0.1, can be calculated.
In one possible case, the preset error threshold under the standard may be 0.3. Because the illumination error 0.1 is smaller than the preset error threshold 0.3, the illumination estimation meets the standard requirement corresponding to the illumination error in this illumination test environment.
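Under the stated assumption that both values are already normalized to [0, 1], this error check can be sketched as follows (function names hypothetical):

```python
def illumination_error(displayed_norm, condition_norm):
    """Absolute difference between the value shown in the augmented reality
    image and the actually generated lighting condition."""
    return abs(displayed_norm - condition_norm)

def meets_standard(error, threshold=0.3):
    """Pass when the illumination error is below the preset error threshold."""
    return error < threshold

err = illumination_error(0.35, 0.25)  # approximately 0.1
ok = meets_standard(err)              # True: 0.1 < 0.3
```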
Then, a second illumination test environment may be constructed via the adjustment buttons of the graphical display interface, for example by setting the white light illuminance to 200lx. The illumination error may be determined in a manner similar to that described above, and accordingly it may be determined whether the illumination estimation meets the standard requirement corresponding to the illumination error in this illumination test environment.
In addition, because the attribute value of the illumination attribute in the second illumination test environment is changed compared with the first illumination test environment, whether the response time of the mobile phone to the illumination condition change is within a preset response time threshold can also be tested. Specifically, the time when the test light of the display light source changes to the test light corresponding to the second illumination test environment can be taken as the start time, the time when the augmented reality image changes to follow the changed illumination attribute can be taken as the end time, and the time difference between the two can be determined as the illumination condition change response time.
By way of example, the start time may be determined as the 10th second and the end time as the 10.3th second, giving an illumination condition change response time of 0.3 seconds. In one possible case, the preset response time threshold under the standard may be 1 second. Since the illumination condition change response time of 0.3 seconds is smaller than the preset response time threshold of 1 second, the illumination estimation is determined to meet the corresponding standard requirement.
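A minimal sketch of this timing check, under the assumption that start and end times are captured in seconds (names hypothetical):

```python
def change_response_time(start_s, end_s):
    """Elapsed time from the display light source switching to the new test
    light until the augmented reality image reflects the change."""
    return end_s - start_s

def response_meets_standard(response_s, threshold_s=1.0):
    return response_s < threshold_s

rt = change_response_time(10.0, 10.3)  # approximately 0.3 seconds
ok = response_meets_standard(rt)       # True: 0.3 s < 1 s threshold
```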
In the same manner, tests can be carried out for the remaining illumination test environments: with only white light on, two further environments with white light illuminance of 300lx and 400lx; with only red light on, four environments with red light illuminance of 100lx, 200lx, 300lx and 400lx; with only green light on, four environments with green light illuminance of 100lx, 200lx, 300lx and 400lx; and with only blue light on, four environments with blue light illuminance of 100lx, 200lx, 300lx and 400lx. For each environment, the corresponding illumination error is determined, and the illumination condition change response time between two adjacent illumination tests is determined. According to the respective standard requirements for the two, it can be determined whether the illumination estimation meets the requirements, and thus whether the illumination test result is a pass or a fail.
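The full sweep of the sixteen environments described above can be sketched as a loop; `measure` is a hypothetical hook standing in for one complete test cycle (build the environment, capture the image, compute the error and the response time), not part of the disclosure:

```python
from itertools import product

COLORS = ["white", "red", "green", "blue"]
ILLUMINANCES_LX = [100, 200, 300, 400]

def run_test_matrix(measure, error_threshold=0.3, response_threshold_s=1.0):
    """Run all 16 lighting test environments and report per-environment and
    overall pass/fail according to the two standard requirements."""
    results = {}
    for color, lx in product(COLORS, ILLUMINANCES_LX):
        error, response_s = measure(color, lx)
        results[(color, lx)] = error < error_threshold and response_s < response_threshold_s
    return all(results.values()), results

# Stub measurement: every environment reports a 0.1 error and a 0.3 s response.
overall_pass, per_env = run_test_matrix(lambda color, lx: (0.1, 0.3))
```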
It will be appreciated by those skilled in the art that, in the methods of the specific embodiments described above, the written order of the steps does not imply a strict order of execution; the actual order of execution should be determined by the function of each step and its possible inherent logic.
Based on the same inventive concept, an embodiment of the disclosure further provides an illumination processing testing apparatus corresponding to the illumination processing testing method. Since the principle by which the apparatus solves the problem is similar to that of the testing method described above, the implementation of the apparatus may refer to the implementation of the method, and repeated description is omitted.
Referring to fig. 6, a schematic diagram of a testing apparatus for illumination processing according to an embodiment of the disclosure is shown, where the apparatus includes: a construction module 61, a first determination module 62, and a second determination module 63; wherein,
a construction module 61, configured to obtain augmented reality images including virtual objects under multiple illumination test environments, respectively; wherein different lighting test environments have different lighting conditions;
a first determining module 62, configured to determine, based on the augmented reality image, illumination display information of the virtual object in the illumination test environment;
The second determining module 63 is configured to determine an illumination test result based on illumination display information of the virtual object under each of the illumination test environments and the illumination conditions respectively corresponding to each of the illumination test environments.
In an alternative embodiment, the lighting conditions comprise at least one of: at least one attribute value under illumination attribute and attribute value change information corresponding to the illumination attribute; the illumination attribute includes at least one of: illumination color, illumination intensity, illumination hue, illumination saturation, and illumination direction.
In an alternative embodiment, the lighting test environment comprises a display light source in a real scene; the testing apparatus further comprises a processing module 64 for constructing a plurality of lighting test environments in the following manner: determining various illumination test information; generating a control signal corresponding to each type of illumination test information aiming at the illumination test information in the plurality of types of illumination test information; and sending the control signal to a display light source, so that the display light source sends out test light according to the control signal, and the illumination test environment is generated.
In an alternative embodiment, the processing module 64 is further configured to: and determining the illumination condition based on the illumination test information and the display error of the display light source to the illumination test information.
In an alternative embodiment, the control signal is configured to indicate at least one of: starting white light, and controlling the illuminance of the white light to be 100lx, 200lx, 300lx and 400lx respectively; the red light is started, and the illuminance of the red light is controlled to be 100lx, 200lx, 300lx and 400lx respectively; the green light is started, and the illuminance of the green light is controlled to be 100lx, 200lx, 300lx and 400lx respectively; blue light is started, and the illuminance of the blue light is controlled to be 100lx, 200lx, 300lx and 400lx respectively.
In an alternative embodiment, the processing module 64, when determining the plurality of lighting test information, is configured to: displaying an adjusting button corresponding to the illumination test information on a graphical display interface; and responding to the adjustment operation of at least one adjustment button, and determining standard attribute information corresponding to the illumination attribute based on the result of the adjustment operation.
In an alternative embodiment, the construction module 61 is configured, when respectively obtaining augmented reality images containing virtual objects under the plurality of illumination test environments, to: for each of the plurality of illumination test environments, control an AR device to acquire, in that illumination test environment, the augmented reality image containing the virtual object.
In an alternative embodiment, the illumination display information of the virtual object includes: in the augmented reality image, display attribute values of each pixel point in a light receiving area corresponding to the virtual object under at least one illumination attribute.
In an alternative embodiment, the second determining module 63 is configured to, when determining an illumination test result based on illumination display information of the virtual object under each of the illumination test environments and the illumination conditions respectively corresponding to each of the illumination test environments: determining illumination errors corresponding to each illumination test environment based on illumination display information of the virtual object in the illumination test environment and illumination conditions in the illumination test environment; and determining the illumination test result based on illumination errors respectively corresponding to the multiple illumination test environments.
In an alternative embodiment, the second determining module 63 is configured, when determining, for each lighting test environment, the illumination error corresponding to that lighting test environment based on the illumination display information of the virtual object under the lighting test environment and the lighting conditions under the lighting test environment, to: normalize the illumination display information and the illumination condition respectively, to obtain normalized illumination display information and a normalized illumination condition; and determine a difference value between the normalized illumination display information and the normalized illumination condition, and determine the difference value as the illumination error.
In an alternative embodiment, the second determining module 63 is configured to, when determining the lighting test result based on the lighting errors corresponding to the multiple lighting test environments, determine: comparing the illumination errors respectively corresponding to the various illumination test environments with a preset error threshold; determining that the illumination test result passes the test in response to illumination errors respectively corresponding to the plurality of illumination test environments being smaller than the preset error threshold; and determining that the illumination test result is failed in the test in response to the illumination error corresponding to any illumination test environment being greater than or equal to the preset error threshold.
In an alternative embodiment, the second determining module 63 is configured to, when determining an illumination test result based on illumination display information of the virtual object under each of the illumination test environments and the illumination conditions respectively corresponding to each of the illumination test environments: determining illumination condition change response time of the augmented reality AR equipment based on illumination display information of the virtual object in each illumination test environment and the illumination conditions respectively corresponding to each illumination test environment; and determining the illumination test result based on the illumination condition change response time.
In an alternative embodiment, the second determining module 63 is configured, when determining the lighting test result based on the lighting condition change response time, to: comparing the illumination condition change response time with a preset response time threshold; determining that the illumination test result is a passing test in response to the illumination condition change response time being less than the preset response time threshold; and determining that the illumination test result is failed in response to the illumination condition change response time being greater than or equal to the preset response time threshold.
The process flow of each module in the apparatus and the interaction flow between the modules may be described with reference to the related descriptions in the above method embodiments, which are not described in detail herein.
The embodiment of the disclosure further provides a computer device, as shown in fig. 7, which is a schematic structural diagram of the computer device provided by the embodiment of the disclosure, including:
a processor 10 and a memory 20, where the memory 20 stores machine-readable instructions executable by the processor 10. The processor 10 is configured to execute the machine-readable instructions stored in the memory 20, and when the machine-readable instructions are executed by the processor 10, the processor 10 performs the following steps:
Respectively acquiring augmented reality images containing virtual objects under various illumination test environments; wherein different lighting test environments have different lighting conditions; determining illumination display information of the virtual object in each illumination test environment based on the augmented reality image; and determining an illumination test result based on illumination display information of the virtual object in each illumination test environment and illumination conditions respectively corresponding to each illumination test environment.
The memory 20 includes an internal memory 210 and an external memory 220. The internal memory 210 is used for temporarily storing operation data of the processor 10 and data exchanged with the external memory 220, such as a hard disk; the processor 10 exchanges data with the external memory 220 through the internal memory 210.
The specific execution process of the above instruction may refer to the steps of the light treatment test method described in the embodiments of the present disclosure, which are not described herein.
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the light treatment test method described in the above method embodiments. Wherein the storage medium may be a volatile or nonvolatile computer readable storage medium.
The embodiments of the present disclosure further provide a computer program product, where the computer program product carries a program code, where instructions included in the program code may be used to perform the steps of the method for testing light treatment described in the foregoing method embodiments, and specifically reference may be made to the foregoing method embodiments, which are not described herein.
Wherein the above-mentioned computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working processes of the system and apparatus described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical functional division, and there may be other manners of division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in essence or a part contributing to the prior art or a part of the technical solution, or in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the foregoing examples are merely specific embodiments of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may, within the technical scope disclosed herein, still modify the technical solutions described in the foregoing embodiments or readily conceive of changes, or make equivalent substitutions for some of the technical features; such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure and shall be covered within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (20)

1. A method of testing light treatment comprising:
respectively acquiring augmented reality images containing virtual objects under various illumination test environments; wherein different lighting test environments have different lighting conditions;
determining illumination display information of the virtual object in each illumination test environment based on the augmented reality image;
And determining an illumination test result based on illumination display information of the virtual object in each illumination test environment and illumination conditions respectively corresponding to each illumination test environment.
2. The test method of claim 1, wherein the lighting conditions comprise at least one of: at least one attribute value under illumination attribute and attribute value change information corresponding to the illumination attribute;
the illumination attribute includes at least one of: illumination color, illumination intensity, illumination hue, illumination saturation, and illumination direction.
3. The method according to claim 1 or 2, wherein the lighting test environment comprises a display light source in a real scene;
the method further includes constructing a plurality of lighting test environments in the following manner:
determining various illumination test information;
generating a control signal corresponding to each type of illumination test information aiming at the illumination test information in the plurality of types of illumination test information;
and sending the control signal to a display light source, so that the display light source sends out test light according to the control signal, and the illumination test environment is generated.
4. A method according to claim 3, characterized in that the method further comprises: and determining the illumination condition based on the illumination test information and the display error of the display light source to the illumination test information.
5. The method of claim 3 or 4, wherein the control signal is used to indicate at least one of:
starting white light, and controlling the illuminance of the white light to be 100lx, 200lx, 300lx and 400lx respectively;
the red light is started, and the illuminance of the red light is controlled to be 100lx, 200lx, 300lx and 400lx respectively;
the green light is started, and the illuminance of the green light is controlled to be 100lx, 200lx, 300lx and 400lx respectively;
blue light is started, and the illuminance of the blue light is controlled to be 100lx, 200lx, 300lx and 400lx respectively.
6. The method of any of claims 3-5, wherein determining a plurality of lighting test information comprises:
displaying an adjusting button corresponding to the illumination test information on a graphical display interface;
and responding to the adjustment operation of at least one adjustment button, and determining standard attribute information corresponding to the illumination attribute based on the result of the adjustment operation.
7. The method according to any one of claims 1-6, wherein separately acquiring the augmented reality image including the virtual object in the plurality of lighting test environments comprises:
for each of the plurality of lighting test environments, controlling an AR device to acquire, in that lighting test environment, the augmented reality image containing the virtual object.
8. The method of any of claims 1-7, wherein the illuminated display information of the virtual object comprises: in the augmented reality image, display attribute values of each pixel point in a light receiving area corresponding to the virtual object under at least one illumination attribute.
9. The method according to any one of claims 1-8, wherein determining an illumination test result based on illumination display information of the virtual object under each of the illumination test environments and the illumination conditions respectively corresponding to each of the illumination test environments includes:
determining illumination errors corresponding to each illumination test environment based on illumination display information of the virtual object in the illumination test environment and illumination conditions in the illumination test environment;
and determining the illumination test result based on illumination errors respectively corresponding to the multiple illumination test environments.
10. The method of claim 9, wherein determining, for each lighting test environment, a lighting error corresponding to the lighting test environment based on lighting display information of the virtual object under the lighting test environment and lighting conditions under the lighting test environment, comprises:
Respectively normalizing the illumination display information and the illumination condition to obtain normalized illumination display information and normalized illumination condition;
and determining a difference value between the normalized illumination display information and the normalized illumination condition, and determining the difference value as the illumination error.
11. The method of claim 9 or 10, wherein determining the lighting test result based on the lighting errors respectively corresponding to the plurality of lighting test environments comprises:
comparing the illumination errors respectively corresponding to the various illumination test environments with a preset error threshold;
determining that the illumination test result passes the test in response to illumination errors respectively corresponding to the plurality of illumination test environments being smaller than the preset error threshold;
and determining that the illumination test result is failed in the test in response to the illumination error corresponding to any illumination test environment being greater than or equal to the preset error threshold.
12. The method according to any one of claims 1-11, wherein determining an illumination test result based on illumination display information of the virtual object under each of the illumination test environments and the illumination conditions respectively corresponding to each of the illumination test environments includes:
Determining illumination condition change response time of the augmented reality AR equipment based on illumination display information of the virtual object in each illumination test environment and the illumination conditions respectively corresponding to each illumination test environment;
and determining the illumination test result based on the illumination condition change response time.
13. The method of claim 12, wherein the determining the lighting test result based on the lighting condition change response time comprises:
comparing the illumination condition change response time with a preset response time threshold;
determining that the illumination test result is a passing test in response to the illumination condition change response time being less than the preset response time threshold;
and determining that the illumination test result is failed in response to the illumination condition change response time being greater than or equal to the preset response time threshold.
14. A light treatment testing device, comprising:
the construction module is used for respectively acquiring the augmented reality images containing the virtual objects under various illumination test environments; wherein different lighting test environments have different lighting conditions;
The first determining module is used for determining illumination display information of the virtual object in the illumination test environment based on the augmented reality image;
the second determining module is used for determining an illumination test result based on illumination display information of the virtual object in each illumination test environment and the illumination conditions respectively corresponding to each illumination test environment.
15. A test system for illumination processing, comprising: a test device, and an augmented reality AR device;
the test device for testing the light treatment performance of the AR device using the test method according to any one of claims 1 to 13.
16. The system of claim 15, wherein the AR device obtains an augmented reality image including a virtual object in each of the lighting test environments during the testing of the lighting performance, and transmits the augmented reality image to the testing device, such that the testing device completes the testing of the lighting performance of the AR device based on the augmented reality image transmitted by the AR device.
17. The system of claim 16, wherein the AR device is configured to acquire the augmented reality image including the virtual object by:
Shooting a real scene image and determining position information of the virtual object in a camera coordinate system corresponding to the real scene image under each of a plurality of illumination test environments;
performing illumination estimation processing on the real scene image to obtain an illumination estimation result of the illumination test environment;
and rendering the virtual object based on the illumination estimation result and the position information to obtain the augmented reality image.
18. The system of claim 17, wherein the illumination estimation result comprises: an attribute estimation value corresponding to at least one illumination attribute;
the AR device is configured to, when performing rendering processing on the virtual object based on the illumination estimation result and the position information to obtain the augmented reality image:
determining a light receiving area of the virtual object affected by the illumination of the display light source based on the position information;
and rendering the light receiving area of the virtual object based on the attribute estimation value corresponding to at least one illumination attribute to obtain the augmented reality image.
19. A computer device, comprising: a processor, a memory storing machine readable instructions executable by the processor for executing machine readable instructions stored in the memory, which when executed by the processor, perform the steps of the light treatment testing method according to any one of claims 1 to 13.
20. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when run by a computer device, performs the steps of the light treatment testing method according to any of claims 1 to 13.
Application CN202210595091.2A (Method, device and system for testing illumination treatment), filed 2022-05-29; published as CN117191343A on 2023-12-08; legal status: pending.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination