CN109461145B - Augmented reality illumination processing test method and device

Augmented reality illumination processing test method and device

Info

Publication number
CN109461145B
CN109461145B
Authority
CN
China
Prior art keywords
image
augmented reality
illumination
difference value
scene images
Prior art date
Legal status
Active
Application number
CN201811211184.0A
Other languages
Chinese (zh)
Other versions
CN109461145A (en)
Inventor
He Wei (何玮)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201811211184.0A
Publication of CN109461145A
Application granted
Publication of CN109461145B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics

Abstract

The invention discloses an augmented reality illumination processing test method and device. The method comprises the following steps: respectively acquiring at least two real scene images corresponding to an augmented reality video image in at least two illumination parameter environments; respectively acquiring at least two synthetic scene images corresponding to an augmented reality synthetic video image in the at least two illumination parameter environments, wherein the augmented reality synthetic video image is determined by setting a test object in the augmented reality video image; and testing the augmented reality illumination processing using the at least two real scene images and the at least two synthetic scene images. The invention solves the technical problem that the related art lacks a test of AR display performance.

Description

Augmented reality illumination processing test method and device
Technical Field
The invention relates to the technical field of augmented reality (AR), and in particular to a method and a device for testing augmented reality illumination processing.
Background
At present, the main functions of the various augmented reality (AR) technologies, such as illumination estimation, are mostly evaluated through subjective human judgment, and an objective accuracy test scheme is lacking. In an AR solution, illumination estimation derives information from the ambient light and provides a color correction function for the camera image.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide an augmented reality illumination processing test method and device, which at least solve the technical problem that the related art lacks a test of AR display performance.
According to an aspect of an embodiment of the present invention, there is provided an augmented reality illumination processing test method, including: respectively acquiring at least two real scene images corresponding to the augmented reality video image in at least two illumination parameter environments; respectively acquiring at least two synthetic scene images corresponding to an augmented reality synthetic video image in at least two illumination parameter environments, wherein the augmented reality synthetic video image is determined by setting a test object in the augmented reality video image; the augmented reality illumination process is tested using at least two real scene images and at least two composite scene images.
According to another aspect of the embodiments of the present invention, there is also provided an augmented reality illumination processing test apparatus, including: a first acquisition module, configured to respectively acquire at least two real scene images corresponding to an augmented reality video image in at least two illumination parameter environments; a second acquisition module, configured to respectively acquire at least two synthetic scene images corresponding to an augmented reality synthetic video image in the at least two illumination parameter environments, wherein the augmented reality synthetic video image is determined by setting a test object in the augmented reality video image; and a testing module, configured to test the augmented reality illumination processing using the at least two real scene images and the at least two synthetic scene images.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is configured to perform the method of any one of the above when executed.
According to another aspect of embodiments of the present invention, there is also provided an electronic apparatus, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the method described in any one of the above.
In the embodiments of the invention, when the display performance of the AR is tested, at least two real scene images and at least two synthetic scene images are obtained in at least two illumination parameter environments; the AR's processing of illumination is then tested based on the at least two real scene images and the at least two synthetic scene images. This achieves the purpose of testing the AR display performance, achieves the technical effect of accurately testing it, and solves the technical problem that the related art lacks a test of AR display performance.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a block diagram of a hardware structure of a mobile terminal for augmented reality illumination processing testing according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of an augmented reality illumination processing test method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of determining a test environment provided in accordance with an embodiment of the present invention;
FIG. 4 is a schematic diagram (I) of an AR image provided in accordance with an embodiment of the present invention;
FIG. 5 is a schematic diagram (II) of an AR image provided in accordance with an embodiment of the present invention;
FIG. 6 is a schematic diagram (III) of an AR image provided in accordance with an embodiment of the present invention;
FIG. 7 is a schematic diagram (IV) of an AR image provided in accordance with an embodiment of the present invention;
FIG. 8 is a schematic diagram (I) of a difference map provided in accordance with an embodiment of the present invention;
FIG. 9 is a schematic diagram (II) of a difference map provided in accordance with an embodiment of the present invention;
FIG. 10 is a schematic diagram (III) of a difference map provided in accordance with an embodiment of the present invention;
FIG. 11 is a diagram illustrating a calculation manner of a difference map according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of an augmented reality illumination processing testing apparatus according to an embodiment of the present invention;
fig. 13 is a graph of the effect of the test according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, an embodiment of an augmented reality illumination processing test method is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different from the order here.
The method provided by the embodiment of the invention can be executed on a mobile terminal, a computer terminal or a similar computing device. Taking execution on a mobile terminal as an example, fig. 1 is a block diagram of the hardware structure of a mobile terminal for the augmented reality illumination processing test according to an embodiment of the present invention. As shown in fig. 1, the mobile terminal 10 may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration and does not limit the structure of the mobile terminal. For example, the mobile terminal 10 may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of an application software, such as a computer program corresponding to the augmented reality lighting processing test method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, so as to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some instances, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
Fig. 2 is a schematic flowchart of an augmented reality illumination processing testing method according to an embodiment of the present invention, and as shown in fig. 2, the method includes the following steps:
step S202, respectively acquiring at least two real scene images corresponding to the augmented reality video image in at least two illumination parameter environments;
step S204, respectively acquiring at least two synthetic scene images corresponding to the augmented reality synthetic video image in the at least two illumination parameter environments, wherein the augmented reality synthetic video image is determined by setting a test object in the augmented reality video image;
step S206, testing the augmented reality illumination processing using the at least two real scene images and the at least two synthetic scene images.
Through the above steps, when the display performance of the AR is tested, at least two real scene images and at least two synthetic scene images are obtained in at least two illumination parameter environments; the AR's processing of illumination is then tested based on the at least two real scene images and the at least two synthetic scene images. This achieves the purpose of testing the AR display performance, achieves the technical effect of accurately testing it, and solves the technical problem that the related art lacks a test of AR display performance.
In an alternative embodiment, before acquiring the real scene image and the composite scene image, a test environment of the acquired image needs to be determined, specifically, the test environment is determined by:
A fixed test environment is prepared, as shown in fig. 3. In this embodiment, the objects in the test environment are kept at constant positions during the test, the natural conditions of the test environment (such as light, temperature, air pressure and humidity) are kept constant, and a fixed test target point (such as the position of the person shown in fig. 3) is selected in the test environment. The test environment may be a sphere of radius R centered on the test target point (e.g., the object shown in fig. 3), including its interior, or an irregular space in which the farthest distance from any point to the test target point is R.
The augmented reality AR obtains information about the test environment through input devices (such as various cameras, sensors, recording devices, and the like) following a fixed procedure (a fixed route and a fixed method of use). At this step the AR completes its understanding of the test environment, including but not limited to the size and position of the various surfaces in the test environment, the lighting conditions of the test environment, and the like.
It should be noted that the test environment in this embodiment is described by taking different illumination parameter environments as an example, but the invention is not limited thereto.
In an alternative embodiment, the real scene images and the synthetic scene images are acquired in the different illumination parameter environments as follows:
A target mark point is determined in the first illumination parameter environment, and the AR camera is aimed at the target mark point. The vector whose starting point is the coordinate of the AR camera and whose end point is the coordinate of the test target point is denoted (x, y, z). The real scene image obtained in this way is shown in fig. 4; this is the first image. A test object, such as a small green Android robot model, is then placed in the first illumination parameter environment. The composite scene image obtained in the first illumination parameter environment is shown in fig. 6; this is the third image.
The first illumination parameter environment is changed into the second illumination parameter environment by changing the illumination parameters. In the second illumination parameter environment, the AR camera is again aimed at the target mark point along the same vector (x, y, z) from the coordinate of the AR camera to the coordinate of the test target point. The real scene image obtained is shown in fig. 5; this is the second image. The test object, e.g. the Android robot model, is then placed in the second illumination parameter environment. The composite scene image obtained in the second illumination parameter environment is shown in fig. 7; this is the fourth image.
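For the analysis below, the four captures can be treated as ordinary image arrays. A minimal loading sketch follows (Python with OpenCV is an assumption of this description, not part of the claimed method, and the file names are hypothetical):

```python
import cv2  # assumed tooling; any image-reading library would do

# Hypothetical file names for the four screenshots captured at viewing angle (x, y, z):
first = cv2.imread("first.png")    # real scene, first illumination parameter environment
second = cv2.imread("second.png")  # real scene, second illumination parameter environment
third = cv2.imread("third.png")    # composite scene (test object placed), first environment
fourth = cv2.imread("fourth.png")  # composite scene (test object placed), second environment

# The images are later differenced pixel by pixel, so they must share one resolution.
assert first.shape == second.shape == third.shape == fourth.shape
```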
In an optional embodiment, after the real scene images and the composite scene images are acquired, the acquired images need to be tested and analyzed, specifically as follows:
The first image in the first illumination parameter environment and the second image in the second illumination parameter environment are differenced pixel by pixel to obtain first pixel difference values; that is, the first image and the second image are combined into a first difference map, as shown in fig. 9.
Furthermore, the calculation manner of the first difference map from the first image and the second image is shown in fig. 11.
Likewise, the third image in the first illumination parameter environment and the fourth image in the second illumination parameter environment are differenced pixel by pixel to obtain second pixel difference values; that is, the third image and the fourth image are combined into a second difference map, as shown in fig. 8.
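By way of illustration only (NumPy is again an assumption of this description), the pixel-by-pixel differencing can be sketched as an absolute difference, which is one natural reading of the step:

```python
import numpy as np

# Cast to a wider signed type first so unsigned 8-bit values cannot wrap around.
# First difference map: real scene, first vs. second illumination environment.
diff1 = np.abs(first.astype(np.int16) - second.astype(np.int16)).astype(np.uint8)
# Second difference map: composite scene, first vs. second illumination environment.
diff2 = np.abs(third.astype(np.int16) - fourth.astype(np.int16)).astype(np.uint8)
```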
In an alternative embodiment, histogram statistics are performed on the first difference map and the second difference map to obtain a histogram statistical result h(i), as shown in Table 1 (for a color level count N of 256), where Table 1 is the histogram statistical result of a difference map:
Table 1:
Pixel difference value   0    1    2    ...  252   253   254   255
Number of pixels         a0   a1   a2   ...  a252  a253  a254  a255
As shown in Table 1, the histogram statistical result comprises the pixel difference values and the number of pixels at each pixel difference value: specifically, the first pixel difference values and the second pixel difference values, the first pixel numbers corresponding to the first pixel difference values, and the second pixel numbers corresponding to the second pixel difference values.
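A sketch of the histogram statistics over a difference map (NumPy again; np.bincount counts, for each level i in 0..N-1, the pixels whose difference value equals i):

```python
N = 256  # color level count, as in Table 1

h1 = np.bincount(diff1.ravel(), minlength=N)  # histogram of the first difference map
h2 = np.bincount(diff2.ravel(), minlength=N)  # histogram of the second difference map
# h[i] corresponds to the pixel count a_i of pixel difference value i in Table 1.
```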
In an optional embodiment, after the histogram statistical result is obtained, the difference value d of the histogram statistical result needs to be calculated. Specifically: a first difference value d1 of the first difference map, relative to the first image and the second image, is calculated based on the first pixel difference values and the first pixel numbers; and a second difference value d2 of the second difference map, relative to the third image and the fourth image, is calculated based on the second pixel difference values and the second pixel numbers. For example, d may be computed as the following weighted sum over the histogram h(i):
d = Σ_{i=0}^{N-1} i · h(i)
Then, the first difference value and the second difference value need to be normalized to obtain normalized difference values.
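A minimal sketch of the difference value and its normalization (the weighted-sum formula above and the choice of normalizing by the maximum attainable value are assumptions of this sketch; the description only states that d is computed from the pixel difference values and pixel numbers and then normalized):

```python
def difference_value(h):
    """Difference value d = sum over i of i * h(i), normalized into [0, 1]."""
    i = np.arange(h.size)
    d = float((i * h).sum())
    # Assumed normalization: divide by the maximum attainable value, which is
    # reached when every pixel has the largest difference level N - 1.
    return d / ((h.size - 1) * h.sum())

d1 = difference_value(h1)  # first difference value (real scene pair)
d2 = difference_value(h2)  # second difference value (composite scene pair)
```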
In an alternative embodiment, if the target to be measured is an object of a single color, only the pixel proportion of the target to be measured in any one image needs to be calculated, where that image is one of the following: the first image, the second image, the third image and the fourth image. A color correction difference value is calculated based on the normalized difference values and the pixel proportion, and the color correction difference value is compared with a preset augmented reality illumination display value to test the display performance of the augmented reality illumination processing.
It should be noted, however, that if the real scene images and the composite scene images are multi-color-channel images (for example, with three RGB channels), the pixel proportion of the target to be measured in each color channel of any one of the images needs to be calculated (as shown in fig. 10), yielding a plurality of color pixel proportions p, where that image is one of the following: the first image, the second image, the third image and the fourth image. A color correction difference value is calculated for each of the color pixel proportions based on the normalized difference values, yielding a plurality of color correction difference values; the sum c of the color correction difference values is calculated; and the sum is compared with a preset augmented reality illumination display value to test the display performance of the augmented reality illumination processing. For example, the calculation formula may be:
c = Σ_k |d1,k - d2,k| / p_k, where k ranges over the color channels, d1,k and d2,k are the normalized difference values of the real and composite image pairs in channel k, and p_k is the color pixel proportion of channel k.
It should be noted that the smaller the value of c, the better the illumination estimation effect; the larger the value of c, the worse the effect.
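Continuing the sketch, the per-channel metric might be computed as follows (the object mask, the per-channel histograms and the |d1 - d2| / p combination are assumptions consistent with, but not dictated by, the description):

```python
def channel_difference_value(img_a, img_b, k, N=256):
    """Normalized difference value d for color channel k of an image pair."""
    diff = np.abs(img_a[..., k].astype(np.int16) - img_b[..., k].astype(np.int16))
    return difference_value(np.bincount(diff.ravel(), minlength=N))

def color_correction_sum(first, second, third, fourth, object_mask):
    """Sum c of per-channel color correction difference values; smaller is better."""
    p = float(object_mask.mean())      # pixel proportion of the test object
    c = 0.0
    for k in range(first.shape[2]):    # one term per color channel
        d1_k = channel_difference_value(first, second, k)  # real scene pair
        d2_k = channel_difference_value(third, fourth, k)  # composite scene pair
        c += abs(d1_k - d2_k) / p
    return c
```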
The above completes the test of the AR's illumination estimation effect on the test object (the target to be measured) in the first and second illumination parameter environments at viewing angle (x, y, z). Tests of multiple test models in multiple test environments and from multiple viewing angles can be carried out by adding further viewing angles, test environments and test models.
This embodiment is mainly used to test the effect of illumination estimation (light estimation): the AR detects information about its ambient light and provides the average light intensity and a color correction for a given camera image. This information enables virtual objects to be lit under the same conditions as their surroundings, increasing their realism. As shown in fig. 13, the ideal result is that the image displayed by the AR blends into the real scene.
In addition, an overall illumination estimation score for the AR may be obtained by averaging the results, weighted, over different viewing angles, different environments and different test models.
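A sketch of such a weighted average (the weights themselves are choices of the test designer and are assumed here for illustration):

```python
def overall_illumination_score(c_values, weights):
    """Weighted mean of the c values measured over different viewing angles,
    test environments and test models."""
    c = np.asarray(c_values, dtype=float)
    w = np.asarray(weights, dtype=float)
    return float((c * w).sum() / w.sum())

# e.g. three test configurations, weighted by how representative each one is:
# overall_illumination_score([0.12, 0.30, 0.08], [0.5, 0.3, 0.2])
```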
In summary, this embodiment further includes a test flow control module, a test image acquisition module and a test image analysis module. Their functions are as follows:
The test flow control module assists the input devices of the AR in understanding the test environment.
The test image acquisition module acquires the test images, i.e. the AR output results.
The test image analysis module calculates and records the AR illumination estimation effect from the test images.
The beneficial effects of this embodiment are as follows. Universality: the scheme places no requirements on the hardware or system platform of the AR and needs no knowledge of the AR's source code or any implementation details. Accuracy: when the test environment is kept under constant conditions, the scheme is little affected by other factors. Extensibility: the balance between error and workload can be adjusted through the number of test points; the more viewing angles, test environments and test models, the larger the workload and the higher the precision.
An embodiment of the present invention further provides an augmented reality illumination processing test apparatus. Fig. 12 is a schematic structural diagram of the augmented reality illumination processing test apparatus provided in an embodiment of the present invention; as shown in fig. 12, the apparatus includes:
a first obtaining module 1202, configured to obtain at least two real scene images corresponding to an augmented reality video image in at least two illumination parameter environments, respectively;
a second obtaining module 1204, configured to obtain at least two synthesized scene images corresponding to the augmented reality synthesized video image in at least two illumination parameter environments, respectively, where the augmented reality synthesized video image is determined by setting a test object in the augmented reality video image;
a testing module 1206 for testing the augmented reality illumination processing using the at least two real scene images and the at least two composite scene images.
Through the above modules, when the display performance of the AR is tested, at least two real scene images and at least two synthetic scene images are obtained in at least two illumination parameter environments; the AR's processing of illumination is then tested based on the at least two real scene images and the at least two synthetic scene images. This achieves the purpose of testing the AR display performance, achieves the technical effect of accurately testing it, and solves the technical problem that the related art lacks a test of AR display performance.
In alternative embodiments of the apparatus, the test environment is determined, the first to fourth images are acquired, and the difference maps, histogram statistics, difference values and color correction difference values are calculated and compared in the same manner as in the method embodiment described above, and the details are not repeated here.
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
S1, respectively acquiring at least two real scene images corresponding to the augmented reality video image in at least two illumination parameter environments;
S2, respectively acquiring at least two synthetic scene images corresponding to the augmented reality synthetic video image in the at least two illumination parameter environments, wherein the augmented reality synthetic video image is determined by setting a test object in the augmented reality video image;
S3, testing the augmented reality illumination processing using the at least two real scene images and the at least two synthetic scene images.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, respectively acquiring at least two real scene images corresponding to the augmented reality video image in at least two illumination parameter environments;
S2, respectively acquiring at least two synthetic scene images corresponding to the augmented reality synthetic video image in the at least two illumination parameter environments, wherein the augmented reality synthetic video image is determined by setting a test object in the augmented reality video image;
S3, testing the augmented reality illumination processing using the at least two real scene images and the at least two synthetic scene images.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. An augmented reality illumination processing test method, comprising:
respectively acquiring at least two real scene images corresponding to the augmented reality video image in at least two illumination parameter environments;
respectively acquiring at least two synthetic scene images corresponding to an augmented reality synthetic video image in the at least two illumination parameter environments, wherein the augmented reality synthetic video image is determined by setting a test object in the augmented reality video image;
and testing the augmented reality illumination processing by using the at least two real scene images and the at least two synthetic scene images.
2. The method according to claim 1, wherein when testing the augmented reality illumination processing using the at least two real scene images and the at least two synthetic scene images, the method comprises:
calculating a first pixel difference value between any first image and any second image in the at least two real scene images to obtain a first difference value image;
calculating a second pixel difference value between any third image and a fourth image in the at least two synthesized scene images to obtain a second difference value image;
the first image and the third image are in a first illumination parameter environment, the second image and the fourth image are in a second illumination parameter environment, and the first illumination parameter environment and the second illumination parameter environment have different illumination parameters.
3. The method of claim 2, wherein after obtaining the first difference map and the second difference map, the method further comprises:
performing histogram statistics on the first difference image and the second difference image to obtain a histogram statistical result;
wherein the histogram statistical result comprises: the first pixel difference value, the second pixel difference value, a first pixel number corresponding to the first pixel difference value, and a second pixel number corresponding to the second pixel difference value.
4. The method of claim 3, wherein after obtaining the histogram statistics, the method further comprises:
calculating a first difference value of the first difference map relative to the first image and the second image based on the first pixel difference value and the first pixel number;
calculating a second difference value of the second difference map with respect to the third image and the fourth image based on the second pixel difference value and the second pixel number;
and normalizing the first difference value and the second difference value to obtain a normalized difference value.
5. The method of claim 4, wherein testing the augmented reality illumination process using the at least two real scene images and the at least two composite scene images comprises:
calculating the pixel proportion of a test object in any image, wherein the any image is one of the following images: the first image, the second image, the third image, and the fourth image;
calculating a color correction difference value based on the normalized difference value and the pixel ratio;
and comparing the color correction difference value with a preset augmented reality illumination display value to test the display performance of the augmented reality illumination processing.
6. The method of claim 4, wherein testing the augmented reality illumination process using the at least two real scene images and the at least two composite scene images when the real scene images and the composite scene images are multi-color channel images comprises:
calculating the pixel proportion of each color channel of a test object in any image to obtain a plurality of color pixel proportions, wherein any image comprises: the first image, the second image, the third image, and the fourth image;
calculating a color correction difference value of each color pixel ratio in the plurality of color pixel ratios based on the normalized difference value to obtain a plurality of color correction difference values;
calculating a sum of the plurality of color correction difference values;
and comparing the sum with a preset augmented reality illumination display value to test the display performance of the augmented reality illumination.
7. An augmented reality illumination processing testing apparatus, comprising:
the first acquisition module is used for respectively acquiring at least two real scene images corresponding to the augmented reality video image under the environment of at least two illumination parameters;
a second obtaining module, configured to obtain at least two synthesized scene images corresponding to an augmented reality synthesized video image in the at least two illumination parameter environments, respectively, where the augmented reality synthesized video image is determined by setting a test object in the augmented reality video image;
and the testing module is used for testing the augmented reality illumination processing by utilizing the at least two real scene images and the at least two synthetic scene images.
8. The apparatus of claim 7, further comprising:
a first determining module, configured to calculate a first pixel difference between any first image and any second image of the at least two real scene images before testing the augmented reality illumination processing using the at least two real scene images and the at least two composite scene images, so as to obtain a first difference map;
the second determining module is used for calculating a second pixel difference value between any third image and a fourth image in the at least two synthesized scene images to obtain a second difference value image;
the first image and the third image are in a first illumination parameter environment, the second image and the fourth image are in a second illumination parameter environment, and the first illumination parameter environment and the second illumination parameter environment have different illumination parameters.
9. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 6 when executed.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 6.
CN201811211184.0A 2018-10-17 2018-10-17 Augmented reality illumination processing test method and device Active CN109461145B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811211184.0A CN109461145B (en) 2018-10-17 2018-10-17 Augmented reality illumination processing test method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811211184.0A CN109461145B (en) 2018-10-17 2018-10-17 Augmented reality illumination processing test method and device

Publications (2)

Publication Number Publication Date
CN109461145A CN109461145A (en) 2019-03-12
CN109461145B (en) 2021-03-23

Family

ID=65607865

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811211184.0A Active CN109461145B (en) 2018-10-17 2018-10-17 Augmented reality illumination processing test method and device

Country Status (1)

Country Link
CN (1) CN109461145B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS54115808A (en) * 1978-02-25 1979-09-08 Japanese National Railways<Jnr> Train centralized centrol board and method of controlling display thereof
WO2012018149A1 (en) * 2010-08-06 2012-02-09 Bizmodeline Co., Ltd. Apparatus and method for augmented reality
JP5872923B2 (en) * 2012-02-22 2016-03-01 株式会社マイクロネット AR image processing apparatus and method
GB2502591B (en) * 2012-05-31 2014-04-30 Sony Comp Entertainment Europe Apparatus and method for augmenting a video image

Also Published As

Publication number Publication date
CN109461145A (en) 2019-03-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant