CN109461145A - Augmented reality lighting process test method and device - Google Patents
- Publication number
- CN109461145A (application CN201811211184.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- augmented reality
- test
- difference
- illumination
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
Abstract
The invention discloses an augmented reality lighting-process test method and device. The method comprises: obtaining, under at least two illumination-parameter environments, at least two real-scene images corresponding to an augmented reality video image; obtaining, under the at least two illumination-parameter environments, at least two composite scene images corresponding to an augmented reality composite video image, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image; and testing the augmented reality lighting process using the at least two real-scene images and the at least two composite scene images. The invention solves the technical problem in the related art of lacking a way to test AR display performance.
Description
Technical field
The present invention relates to the field of AR technology, and in particular to an augmented reality lighting-process test method and device.
Background art
The main functions of current augmented reality (Augmented Reality, abbreviated AR) systems, such as illumination estimation, are mostly evaluated by subjective human judgment; an objective and precise testing scheme is lacking. Illumination estimation is the function in an AR solution that detects information about the ambient light and provides color correction for the camera image.
No effective solution to the above problem has yet been proposed.
Summary of the invention
Embodiments of the present invention provide an augmented reality lighting-process test method and device, so as to at least solve the technical problem in the related art of lacking a way to test AR display performance.
According to one aspect of the embodiments of the present invention, an augmented reality lighting-process test method is provided, comprising: obtaining, under at least two illumination-parameter environments, at least two real-scene images corresponding to an augmented reality video image; obtaining, under the at least two illumination-parameter environments, at least two composite scene images corresponding to an augmented reality composite video image, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image; and testing the augmented reality lighting process using the at least two real-scene images and the at least two composite scene images.
According to another aspect of the embodiments of the present invention, a device for testing augmented reality (AR) display is also provided, comprising: a first acquisition module, configured to obtain, under at least two illumination-parameter environments, at least two real-scene images corresponding to an augmented reality video image; a second acquisition module, configured to obtain, under the at least two illumination-parameter environments, at least two composite scene images corresponding to an augmented reality composite video image, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image; and a test module, configured to test the augmented reality lighting process using the at least two real-scene images and the at least two composite scene images.
According to another aspect of the embodiments of the present invention, a storage medium is also provided, in which a computer program is stored, wherein the computer program is configured to execute any of the methods described above when run.
According to another aspect of the embodiments of the present invention, an electronic device is also provided, comprising a memory and a processor, wherein a computer program is stored in the memory and the processor is configured to run the computer program so as to execute any of the methods described above.
In the embodiments of the present invention, when the display performance of AR is tested, at least two real-scene images and at least two composite scene images are obtained under at least two illumination-parameter environments; the AR system's handling of illumination is then tested based on the real-scene images and the composite scene images. This achieves the goal of testing AR display performance, realizes the technical effect of testing AR display performance accurately, and thereby solves the technical problem in the related art of lacking a way to test AR display performance.
Brief description of the drawings
The drawings described here are provided for a further understanding of the present invention and constitute a part of this application; the illustrative embodiments of the present invention and their descriptions are used to explain the present invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a hardware block diagram of a mobile terminal for the augmented reality lighting-process test of an embodiment of the present invention;
Fig. 2 is a flow diagram of the augmented reality lighting-process test method provided according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of determining the test environment provided according to an embodiment of the present invention;
Fig. 4 is a schematic diagram (one) of an AR image provided according to an embodiment of the present invention;
Fig. 5 is a schematic diagram (two) of an AR image provided according to an embodiment of the present invention;
Fig. 6 is a schematic diagram (three) of an AR image provided according to an embodiment of the present invention;
Fig. 7 is a schematic diagram (four) of an AR image provided according to an embodiment of the present invention;
Fig. 8 is a schematic diagram (one) of a difference image provided according to an embodiment of the present invention;
Fig. 9 is a schematic diagram (two) of a difference image provided according to an embodiment of the present invention;
Fig. 10 is a schematic diagram (three) of a difference image provided according to an embodiment of the present invention;
Fig. 11 is a schematic diagram of the calculation of the difference image in this embodiment;
Fig. 12 is a structural schematic diagram of the augmented reality lighting-process test device provided according to an embodiment of the present invention;
Fig. 13 is a test effect diagram according to an embodiment of the present invention.
Specific embodiment
To enable those skilled in the art to better understand the solution of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the scope protected by the present invention.
It should be noted that the terms "first", "second", and the like in the description, the claims, and the above drawings are used to distinguish similar objects and are not used to describe a particular order or sequence. It should be understood that data so used are interchangeable under appropriate circumstances, so that the embodiments of the present invention described here can be implemented in an order other than those illustrated or described here. In addition, the terms "comprising" and "having", and any variants of them, are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device containing a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units not explicitly listed or inherent to the process, method, product, or device.
According to an embodiment of the present invention, an embodiment of an augmented reality lighting-process test method is provided. It should be noted that the steps shown in the flowchart of the accompanying drawings can be executed in a computer system such as a set of computer-executable instructions, and although a logical order is shown in the flowchart, in some cases the steps shown or described can be executed in an order different from the one here.
The method embodiment provided by the embodiment of the present invention can be executed in a mobile terminal, a computer terminal, or a similar computing device. Taking running on a mobile terminal as an example, Fig. 1 is a hardware block diagram of a mobile terminal for the augmented reality lighting-process test of the embodiment of the present invention. As shown in Fig. 1, the mobile terminal 10 may include one or more processors 102 (only one is shown in Fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microcontroller (MCU) or a programmable logic device such as an FPGA) and a memory 104 for storing data. Optionally, the above mobile terminal may also include a transmission device 106 for communication functions and an input/output device 108. Those skilled in the art will appreciate that the structure shown in Fig. 1 is only illustrative and does not limit the structure of the above mobile terminal. For example, the mobile terminal 10 may also include more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1.
The memory 104 can be used to store computer programs, for example, software programs and modules of application software, such as the computer program corresponding to the augmented reality lighting-process test method in the embodiment of the present invention. By running the computer program stored in the memory 104, the processor 102 executes various functional applications and data processing, thereby realizing the above method. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, and these remote memories may be connected to the mobile terminal 10 through a network. Examples of the above network include, but are not limited to, the internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission device 106 is used to receive or send data via a network. A specific example of the above network may include a wireless network provided by the communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the internet. In another example, the transmission device 106 can be a radio frequency (RF) module, which is used to communicate with the internet wirelessly.
Fig. 2 is a flow diagram of the augmented reality lighting-process test method provided according to an embodiment of the present invention. As shown in Fig. 2, the method comprises the following steps:
Step S202: obtain, under at least two illumination-parameter environments, at least two real-scene images corresponding to an augmented reality video image;
Step S204: obtain, under the at least two illumination-parameter environments, at least two composite scene images corresponding to an augmented reality composite video image, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image;
Step S206: test the augmented reality lighting process using the at least two real-scene images and the at least two composite scene images.
Through the above steps, when the display performance of AR is tested, at least two real-scene images and at least two composite scene images are obtained under at least two illumination-parameter environments; the AR system's handling of illumination is then tested based on the real-scene images and the composite scene images. This achieves the goal of testing AR display performance, realizes the technical effect of testing AR display performance accurately, and thereby solves the technical problem in the related art of lacking a way to test AR display performance.
In an alternative embodiment, before the real-scene images and composite scene images are obtained, the test environment in which the images will be captured needs to be determined first. The test environment is determined in the following manner:
Prepare one fixed test environment, as shown in Fig. 3. In this embodiment, the objects in the test environment all keep their positions unchanged during testing, and the natural conditions of the test environment (such as illumination, temperature, air pressure, and humidity) are kept constant. A fixed test target point is then selected in the test environment (the position of the person shown in Fig. 3). The test environment can be a sphere, with the test target point (the object shown in Fig. 3) as the center and radius R, together with its interior; it can also be an irregular region in which the maximum distance from any point to the test target point is R.
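The spherical form of the test environment above can be checked mechanically. The following is a minimal sketch (the function name and the coordinates are illustrative, not from the patent) that tests whether a given point lies within the sphere of radius R centered on the test target point:

```python
import math

def in_test_environment(point, target, radius):
    """Return True if `point` lies within the spherical test
    environment of radius R centered on the test target point."""
    dx, dy, dz = (p - t for p, t in zip(point, target))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius

# A camera placed 3 m from the target is inside a 5 m environment.
print(in_test_environment((3.0, 0.0, 0.0), (0.0, 0.0, 0.0), 5.0))  # True
```

For the irregular variant of the test environment, the same predicate simply becomes the condition that the distance to the target never exceeds R for any point of the region.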
The augmented reality (AR) system is allowed to obtain information about the test environment through its input devices (such as various cameras, sensors, and recording devices) by a fixed procedure (a fixed route and a fixed method of use). In this step the AR system can complete its understanding of the test environment, including but not limited to the sizes and positions of the various surfaces in the test environment and information such as the illumination conditions of the test environment.
It should be noted that the test environment in this embodiment is illustrated by taking different illumination-parameter environments as an example, but is not limited thereto.
In an alternative embodiment, the real-scene images and composite scene images are obtained under the different illumination-parameter environments in the following manner:
A target marker point is determined in the first illumination-parameter environment, and the AR camera is aimed at the target marker point. The vector whose starting point is the coordinate of the AR camera and whose end point is the coordinate of the test target point is denoted (x, y, z). The real-scene image obtained is shown in Fig. 4; this is the first image. A test object, such as the green Android figure, is then placed in the first illumination-parameter environment. The composite scene image obtained in the first illumination-parameter environment is shown in Fig. 6; this is the third image.
By changing the illumination parameters of the first illumination-parameter environment, the first illumination-parameter environment is turned into the second illumination-parameter environment. In the second illumination-parameter environment, the AR camera is aimed at the target marker point. The vector whose starting point is the coordinate of the AR camera and whose end point is the coordinate of the test target point is again denoted (x, y, z). The real-scene image obtained is shown in Fig. 5; this is the second image. The test object, such as the green Android figure, is then placed in the second illumination environment. The composite scene image obtained in the second illumination-parameter environment is shown in Fig. 7; this is the fourth image.
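Because the four images are only comparable if the camera-to-target vector (x, y, z) is identical across the two illumination-parameter environments, a small consistency check can be useful. This is a sketch under assumed coordinates; none of the names or numbers come from the patent:

```python
def camera_to_target(cam_pos, target_pos):
    """Vector from the AR camera position to the test target point."""
    return tuple(t - c for t, c in zip(target_pos, cam_pos))

def same_viewpoint(cam_pos, target_pos, expected_vec, tol=1e-6):
    """True if a capture still uses the recorded vector (x, y, z),
    so images from the two illumination environments are comparable."""
    vec = camera_to_target(cam_pos, target_pos)
    return all(abs(a - b) <= tol for a, b in zip(vec, expected_vec))

# Record (x, y, z) at the first capture, then verify later captures.
xyz = camera_to_target((0.0, 1.5, 0.0), (2.0, 1.5, 3.0))
print(same_viewpoint((0.0, 1.5, 0.0), (2.0, 1.5, 3.0), xyz))  # True
```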
In an alternative embodiment, after the real-scene images and composite scene images are obtained, the obtained images need to be analyzed. The analysis specifically includes the following:
The first image, from the first illumination-parameter environment, and the second image, from the second illumination-parameter environment, are differenced pixel by pixel to obtain the first pixel difference values; that is, the first image and the second image are combined into a first difference image, as shown in Fig. 9. In addition, the manner in which the first difference image is calculated from the first image and the second image is shown in Fig. 11.
The third image, from the first illumination-parameter environment, and the fourth image, from the second illumination-parameter environment, are differenced pixel by pixel to obtain the second pixel difference values; that is, the third image and the fourth image are combined into a second difference image, as shown in Fig. 8.
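As a minimal sketch of the pixel-by-pixel differencing step, assuming 8-bit single-channel images and absolute differences (the patent does not state the exact difference convention):

```python
import numpy as np

def difference_image(img_a, img_b):
    """Pixel-wise absolute difference of two same-sized 8-bit images."""
    a = img_a.astype(np.int16)  # widen first to avoid uint8 wrap-around
    b = img_b.astype(np.int16)
    return np.abs(a - b).astype(np.uint8)

first = np.array([[10, 200], [0, 255]], dtype=np.uint8)   # stand-in image
second = np.array([[12, 190], [5, 250]], dtype=np.uint8)  # stand-in image
print(difference_image(first, second))
# [[ 2 10]
#  [ 5  5]]
```

The same call with the third and fourth images would produce the second difference image.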
In an alternative embodiment, histogram statistics need to be computed for the first difference image and the second difference image, yielding a histogram H(i), as shown in Table 1 (for the case of N = 256 gray levels). Table 1 shows the histogram statistics of a difference image:
Table 1:
Pixel difference value | 0 | 1 | 2 | ... | 252 | 253 | 254 | 255 |
Pixel count | a0 | a1 | a2 | ... | a252 | a253 | a254 | a255 |
As shown in Table 1, the histogram statistics include the pixel difference values and the pixel count corresponding to each pixel difference value: specifically, the first pixel difference value, the second pixel difference value, ..., the first pixel count corresponding to the first pixel difference value, the second pixel count corresponding to the second pixel difference value, and so on.
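The histogram H(i) of Table 1 counts, for each possible pixel difference value i, the number a_i of pixels with that value. A sketch with N = 256 levels:

```python
import numpy as np

def difference_histogram(diff_img, levels=256):
    """Histogram H(i): number of pixels whose difference value is i.
    hist[i] corresponds to a_i in Table 1."""
    return np.bincount(diff_img.ravel(), minlength=levels)

diff = np.array([[0, 0, 1], [2, 1, 0]], dtype=np.uint8)  # toy difference image
h = difference_histogram(diff)
print(h[:3])  # [3 2 1]  ->  a0 = 3, a1 = 2, a2 = 1
```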
In an alternative embodiment, after the histogram statistics are obtained, a difference value d needs to be calculated from them. Specifically: a first difference value d1 of the first difference image, relative to the first image and the second image, is calculated based on the first pixel difference values and the first pixel counts; and a second difference value d2 of the second difference image, relative to the third image and the fourth image, is calculated based on the second pixel difference values and the second pixel counts. For example, the formula for d is expressed as follows:
The first difference value and the second difference value then need to be normalized, yielding normalized difference values.
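The formula for d appears only as an image in the original patent, so the following is an assumed reconstruction, not the patent's formula: d is taken here as the histogram-weighted mean pixel difference, normalized to [0, 1] by the maximum level N − 1.

```python
import numpy as np

def difference_value(hist, levels=256):
    """Assumed form: d = sum(i * a_i) / sum(a_i), i.e. the mean pixel
    difference, then normalized by the maximum gray level (N - 1)."""
    i = np.arange(levels)
    d = (i * hist).sum() / hist.sum()
    return d / (levels - 1)  # normalized difference value in [0, 1]

hist = np.zeros(256, dtype=np.int64)
hist[0], hist[255] = 3, 1   # 3 identical pixels, 1 maximally different
print(difference_value(hist))  # 0.25
```

Under this reading, d = 0 means the two images are identical pixel for pixel, and d = 1 means every pixel differs maximally.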
In an alternative embodiment, if the object under test is an object of a single color, it is only necessary to calculate the pixel proportion of the object under test in any one of the images, where "any image" includes the first image, the second image, the third image, and the fourth image; a color-correction difference value is then calculated based on the normalized difference value and the pixel proportion; and the color-correction difference value is compared with a preset augmented reality illumination display value, so as to test the display performance of the augmented reality lighting process.
However, it should be recognized that if the real-scene images and composite scene images are multi-channel color images (for example, three-channel RGB), then the pixel proportion of the object under test in each color channel of any one of the images needs to be calculated (as shown in Fig. 10), yielding multiple color pixel proportions p, where "any image" again includes the first image, the second image, the third image, and the fourth image; the color-correction difference value of each color pixel proportion among the multiple color pixel proportions is calculated based on the normalized difference value, yielding multiple color-correction difference values; the sum c of the multiple color-correction difference values is calculated; and the sum is compared with the preset augmented reality illumination display value, so as to test the display performance of the augmented reality illumination. For example, the calculation formula can be:
It should be noted that the smaller the value of c, the better the illumination estimation effect, and vice versa.
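The formula for c is likewise not reproduced in the text. One plausible reading, used here only as an assumption, is that each channel's color-correction difference value is its normalized difference value weighted by that channel's pixel proportion p of the object under test, and c is their sum:

```python
def color_correction_sum(norm_diffs, pixel_props):
    """Assumed form: c = sum over channels of (normalized difference
    value * pixel proportion of the object under test)."""
    return sum(d * p for d, p in zip(norm_diffs, pixel_props))

# Hypothetical per-channel (R, G, B) values for illustration only.
norm_diffs = [0.10, 0.05, 0.20]
pixel_props = [0.30, 0.30, 0.40]
c = color_correction_sum(norm_diffs, pixel_props)
print(round(c, 3))  # 0.125 -- smaller c means better illumination estimation
```

For a single-color object this reduces to one term: the normalized difference value times the object's pixel proportion.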
As can be seen from the above, the test of the AR illumination estimation effect on the test object (the object under test) has been completed for the viewing angle (x, y, z) in the first illumination-parameter environment and the second illumination-parameter environment. The test can also be extended to more viewing angles, more test environments, and more test models, carrying out a multi-view, multi-environment, multi-model test.
This embodiment is mainly used to test the effect of illumination estimation (light estimation), i.e. the AR system can detect information about its ambient light and provide the average light intensity and the color correction of a given camera image. This information can be used to illuminate virtual objects with the same lighting as their surroundings, enhancing their sense of realism; as shown in Fig. 13, the optimal test result is that the image displayed by the AR system blends into the real scene.
In addition, a global illumination estimation effect for the AR system can be obtained by taking a weighted average of the results under different viewing angles, different environments, and different test models.
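The weights for this global average are not specified in the text, so the sketch below simply assumes a normalized weighted mean of the per-configuration values of c:

```python
def global_score(c_values, weights):
    """Weighted average of per-configuration scores (one score per
    viewing angle / environment / test model combination).
    The weights are an assumption, not given in the patent."""
    total = sum(weights)
    return sum(c * w for c, w in zip(c_values, weights)) / total

scores = [0.10, 0.20, 0.40]   # c from three hypothetical configurations
weights = [2.0, 1.0, 1.0]     # e.g. emphasize the primary viewing angle
print(global_score(scores, weights))  # ~0.2
```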
To summarize, this embodiment further includes a test-flow control module, a test-image acquisition module, and a test-image analysis module. Their specific roles are as follows:
The test-flow control module is responsible for assisting the input devices and apparatus of the AR system in understanding the test environment.
The test-image acquisition module is responsible for obtaining the AR output results.
The test-image analysis module records the test images and obtains the AR illumination estimation effect from them through calculation and analysis.
This embodiment has the following beneficial effects. Generality: the overall technical solution places no requirements on the AR hardware or system platform, and requires no knowledge of the AR source code or any implementation details. Accuracy: when the test environment is kept under controlled conditions, the solution is little affected by other factors. Scalability: the trade-off between the error and the workload of the technical solution can be adjusted by choosing the number of test points; the more test angles, test environments, and test models, the greater the workload and the higher the precision.
An embodiment of the present invention also provides an augmented reality lighting-process test device. Fig. 12 is a structural schematic diagram of the augmented reality lighting-process test device provided according to an embodiment of the present invention. As shown in Fig. 12, the device includes:
a first acquisition module 1202, configured to obtain, under at least two illumination-parameter environments, at least two real-scene images corresponding to an augmented reality video image;
a second acquisition module 1204, configured to obtain, under the at least two illumination-parameter environments, at least two composite scene images corresponding to an augmented reality composite video image, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image; and
a test module 1206, configured to test the augmented reality lighting process using the at least two real-scene images and the at least two composite scene images.
Through the above modules, when the display performance of AR is tested, at least two real-scene images and at least two composite scene images are obtained under at least two illumination-parameter environments; the AR system's handling of illumination is then tested based on the real-scene images and the composite scene images. This achieves the goal of testing AR display performance, realizes the technical effect of testing AR display performance accurately, and thereby solves the technical problem in the related art of lacking a way to test AR display performance.
In an alternative embodiment, it before obtaining real scene image and synthesis scene image, needs first to determine
The test environment for obtaining image determines test environment especially by following manner:
Prepare 1 fixed test environment, as shown in figure 3, in the present embodiment test environment in object in testing
All holding position is constant, and the natural conditions (such as illumination, temperature, air pressure, humidity) for testing environment are kept constant, and is then testing
A fixed test target point (position of personage as shown in Figure 3) is selected in environment.Test environment can be one to survey
Trying target point (object as shown in Figure 3) is the centre of sphere, and ball and its inside that radius is R are also possible to irregular, test wrapper
The space that any point is R to testing location maximum distance in border.
Allow augmented reality AR by input equipment (such as various cameras, sensor, recording device) by fixed stream
Journey (fixed route and fixed application method) obtains the information of test environment.AR can be completed in the step to test wrapper
The understanding in border, the size and location on all kinds of surfaces, tests the information such as the illumination condition of environment including but not limited in test environment.
It should be noted that the test environment in the present embodiment is illustrated by taking different illumination parameter environment as an example, but
It is without being limited thereto.
In an alternative embodiment, under different illumination parameter environment, real scene is obtained in the following manner
Image and synthesis scene image:
Target label point is determined in the first illumination parameter environment, by AR camera alignment target mark point.Starting point is AR
Coordinate where camera, terminal are that the vector of coordinate where test target point is denoted as (x, y, z), and the real scene image of acquisition is such as
Shown in Fig. 4, the first image is obtained.Then test object, such as the small green people of Android are placed in the first illumination parameter environment.?
The synthesis scene image obtained in one illumination parameter environment is as shown in fig. 6, obtain third image.
By changing the illumination parameter in the first illumination parameter environment, the first illumination parameter environment is made to become the second illumination ginseng
Ring of numbers border.In the second illumination parameter environment, by AR camera alignment target mark point.Starting point is coordinate where AR camera,
The vector of coordinate where terminal is test target point is denoted as (x, y, z), and the real scene image of acquisition is as shown in figure 5, obtain the
Two images.Then test object, such as the small green people of Android are placed in the second light environment.It is obtained in the second illumination parameter environment
The synthesis scene image taken is as shown in fig. 7, obtain the 4th image.
In an alternative embodiment, it after obtaining real scene image and synthesis scene image, needs to acquisition
Image carries out test analysis, specifically includes the following contents:
Second image pixel by pixel under first image under first illumination parameter environment and the second illumination parameter environment is done into difference,
The first pixel value difference is obtained, is to merge the first image and the second image to obtain the first differential chart, as shown in Figure 9.
In addition, the extreme manner of the first differential chart between the first image and the second image is as shown in figure 11.
4th image pixel by pixel under third image under second illumination parameter environment and the second illumination parameter environment is done into difference,
The second pixel value difference is obtained, is to merge third image and the 4th image to obtain the second differential chart, as shown in Figure 10.
In an alternative embodiment, it needs to do statistics with histogram to the first differential chart and the second differential chart, obtain straight
Square figure statistical result H (i), as shown in table 1 (when the color range N=256 the case where), table 1 is the statistics with histogram result of differential chart:
Table 1:
Pixel value difference | 0 | 1 | 2 | ... | 252 | 253 | 254 | 255 |
Pixel number | a0 | a1 | a2 | ... | a252 | a253 | a254 | a255 |
As shown in table 1, including pixel value difference and pixel number corresponding with pixel value difference in statistics with histogram result, specially
First pixel value difference, the second pixel value difference ..., the first pixel number corresponding with the first pixel value difference, with the second pixel value difference pair
The second pixel number ... the answered
In an alternative embodiment, after obtaining statistics with histogram result, need to calculate statistics with histogram result
Difference value d, specifically: the first differential chart is calculated relative to the first image and based on the first pixel value difference and the first pixel number
First difference value d1 of two images;The second differential chart is calculated relative to third figure based on the second pixel value difference and the second pixel number
Second difference value d2 of picture and the 4th image;For example, the formula of d is expressed as follows:
Then, it needs that the first difference value and the second difference value is normalized, obtains normalized difference value.
In an alternative embodiment, if object to be measured is the object of a single color, it is only necessary to calculate to be measured
Pixel accounting of the target in any image, wherein any image includes: the first image, the second image, third image and the 4th
Image;Colour correction difference value is calculated based on normalized difference value and pixel accounting;By colour correction difference value with it is preset
Augmented reality illumination show value is compared, to test the display performance of augmented reality lighting process.
It should be recognized, however, that if the real scene images and the composite scene images are multi-channel colour images (for example, three-channel RGB), the pixel proportion of the object under test must be calculated for each colour channel of any one image, yielding multiple colour pixel proportions p, where the image is one of: the first image, the second image, the third image, and the fourth image. A colour correction difference value is calculated from the normalized difference value for each of the colour pixel proportions, giving multiple colour correction difference values; their sum c is then calculated and compared with the preset augmented reality illumination display value, to test the display performance of the augmented reality illumination. For example, c can be calculated as the sum of the per-channel colour correction difference values.
It should be noted that a smaller c indicates a better illumination estimation effect, and a larger c a worse one.
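For the multi-channel case, the per-channel combination and the sum c can be sketched as below. The per-channel formula mirrors the single-colour assumption above (division of the normalized difference value by the proportion), which the text leaves unspecified.

```python
def multichannel_sum(d_norm, channel_proportions):
    """Sketch: derive one colour-correction difference value per colour
    channel from the normalized difference value and that channel's pixel
    proportion (assumed here to be d_norm / p), then sum them into c.
    A smaller c means a better illumination estimation effect."""
    corrections = [d_norm / p for p in channel_proportions]
    return sum(corrections)
```

For instance, with a normalized difference value of 0.3 and RGB proportions (0.5, 0.25, 0.25), c comes out near 3.0; a test run under different illumination that yields a smaller c would indicate a better estimate.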
As can be seen from the above, the test of the illumination estimation effect on the test object (object under test) has been completed for viewpoint (x, y, z) under the first illumination parameter environment and the second illumination parameter environment. The test can also be extended to more viewpoints, more test environments, and more test models, i.e. a multi-viewpoint, multi-environment, multi-model test.
This embodiment is mainly used to test the effect of illumination estimation (light estimation): AR can detect information about its ambient light and provide the average light intensity and a colour correction for a given camera image. This information makes it possible to illuminate virtual objects with the same illumination as the surrounding environment, enhancing their realism. As shown in Figure 13, the optimal test result is that the image displayed by AR blends into the real scene.
An embodiment of the present invention also provides a storage medium in which a computer program is stored, wherein the computer program is arranged to perform, when run, the steps of any one of the above method embodiments.
Optionally, in this embodiment, the above storage medium may be arranged to store a computer program for executing the following steps:
S1: obtaining, under at least two illumination parameter environments, at least two real scene images corresponding to an augmented reality video image respectively;
S2: obtaining, under the at least two illumination parameter environments, at least two composite scene images corresponding to an augmented reality composite video image respectively, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image;
S3: testing the augmented reality lighting process using the at least two real scene images and the at least two composite scene images.
Optionally, in this embodiment, the above storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, or any other medium that can store a computer program.
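The S1-S3 flow stored on such a medium can be sketched as below; the four callables (`lighting_envs` entries, `capture_real`, `capture_composite`, `evaluate`) are hypothetical placeholders for the capture and evaluation logic, not names from the patent.

```python
def run_lighting_test(lighting_envs, capture_real, capture_composite, evaluate):
    """Sketch of steps S1-S3: capture real and composite scene images under
    each illumination parameter environment, then test the augmented
    reality lighting process using both image sets."""
    real_images = [capture_real(env) for env in lighting_envs]            # S1
    composite_images = [capture_composite(env) for env in lighting_envs]  # S2
    return evaluate(real_images, composite_images)                        # S3
```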
An embodiment of the present invention also provides an electronic device comprising a memory and a processor; a computer program is stored in the memory, and the processor is arranged to run the computer program to perform the steps of any one of the above method embodiments.
Optionally, the above electronic device may further include a transmission device and an input/output device, wherein both the transmission device and the input/output device are connected to the above processor.
Optionally, in this embodiment, the above processor may be arranged to execute, through the computer program, the following steps:
S1: obtaining, under at least two illumination parameter environments, at least two real scene images corresponding to an augmented reality video image respectively;
S2: obtaining, under the at least two illumination parameter environments, at least two composite scene images corresponding to an augmented reality composite video image respectively, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image;
S3: testing the augmented reality lighting process using the at least two real scene images and the at least two composite scene images.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations; details are not repeated here.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not described in detail in a given embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed technical content may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division into units is only a logical functional division, and there may be other division manners in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling, direct coupling, or communication connections shown or discussed may be indirect coupling or communication connections through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the various embodiments of the present invention. The aforementioned storage medium includes: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, or any other medium that can store program code.
The above are only preferred embodiments of the present invention. It should be noted that those of ordinary skill in the art may make various improvements and modifications without departing from the principles of the present invention, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention.
Claims (10)
1. An augmented reality lighting process test method, characterized by comprising:
obtaining, under at least two illumination parameter environments, at least two real scene images corresponding to an augmented reality video image respectively;
obtaining, under the at least two illumination parameter environments, at least two composite scene images corresponding to an augmented reality composite video image respectively, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image;
testing the augmented reality lighting process using the at least two real scene images and the at least two composite scene images.
2. The method according to claim 1, characterized in that, when testing the augmented reality lighting process using the at least two real scene images and the at least two composite scene images, the method comprises:
calculating a first pixel value difference between any first image and second image in the at least two real scene images, to obtain a first differential image;
calculating a second pixel value difference between any third image and fourth image in the at least two composite scene images, to obtain a second differential image;
wherein the first image and the third image are under a first illumination parameter environment, the second image and the fourth image are under a second illumination parameter environment, and the illumination parameters of the first illumination parameter environment and the second illumination parameter environment are different.
3. The method according to claim 2, characterized in that, after obtaining the first differential image and the second differential image, the method further comprises:
performing histogram statistics on the first differential image and the second differential image, to obtain a histogram statistics result;
wherein the histogram statistics result includes: the first pixel value difference, the second pixel value difference, a first pixel count corresponding to the first pixel value difference, and a second pixel count corresponding to the second pixel value difference.
4. The method according to claim 3, characterized in that, after obtaining the histogram statistics result, the method further comprises:
calculating a first difference value of the first differential image relative to the first image and the second image, based on the first pixel value difference and the first pixel count;
calculating a second difference value of the second differential image relative to the third image and the fourth image, based on the second pixel value difference and the second pixel count;
normalizing the first difference value and the second difference value, to obtain a normalized difference value.
5. The method according to claim 4, characterized in that testing the augmented reality lighting process using the at least two real scene images and the at least two composite scene images comprises:
calculating a pixel proportion of an object under test in any image, wherein the image is one of: the first image, the second image, the third image, and the fourth image;
calculating a colour correction difference value based on the normalized difference value and the pixel proportion;
comparing the colour correction difference value with a preset augmented reality illumination display value, to test the display performance of the augmented reality lighting process.
6. The method according to claim 4, characterized in that, when the real scene images and the composite scene images are multi-channel colour images, testing the augmented reality lighting process using the at least two real scene images and the at least two composite scene images comprises:
calculating a pixel proportion of the object under test in each colour channel of any image, to obtain multiple colour pixel proportions, wherein the image is one of: the first image, the second image, the third image, and the fourth image;
calculating a colour correction difference value for each colour pixel proportion in the multiple colour pixel proportions, based on the normalized difference value, to obtain multiple colour correction difference values;
calculating a sum of the multiple colour correction difference values;
comparing the sum with a preset augmented reality illumination display value, to test the display performance of the augmented reality illumination.
7. An augmented reality lighting process test device, characterized by comprising:
a first obtaining module, configured to obtain, under at least two illumination parameter environments, at least two real scene images corresponding to an augmented reality video image respectively;
a second obtaining module, configured to obtain, under the at least two illumination parameter environments, at least two composite scene images corresponding to an augmented reality composite video image respectively, wherein the augmented reality composite video image is determined by placing a test object in the augmented reality video image;
a test module, configured to test the augmented reality lighting process using the at least two real scene images and the at least two composite scene images.
8. The device according to claim 7, characterized in that the device further comprises:
a first determining module, configured to calculate, before the augmented reality lighting process is tested using the at least two real scene images and the at least two composite scene images, a first pixel value difference between any first image and second image in the at least two real scene images, to obtain a first differential image;
a second determining module, configured to calculate a second pixel value difference between any third image and fourth image in the at least two composite scene images, to obtain a second differential image;
wherein the first image and the third image are under a first illumination parameter environment, the second image and the fourth image are under a second illumination parameter environment, and the illumination parameters of the first illumination parameter environment and the second illumination parameter environment are different.
9. A storage medium, characterized in that a computer program is stored in the storage medium, wherein the computer program is arranged to perform, when run, the method according to any one of claims 1 to 6.
10. An electronic device, comprising a memory and a processor, characterized in that a computer program is stored in the memory, and the processor is arranged to run the computer program to perform the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811211184.0A CN109461145B (en) | 2018-10-17 | 2018-10-17 | Augmented reality illumination processing test method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109461145A true CN109461145A (en) | 2019-03-12 |
CN109461145B CN109461145B (en) | 2021-03-23 |
Family
ID=65607865
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811211184.0A Active CN109461145B (en) | 2018-10-17 | 2018-10-17 | Augmented reality illumination processing test method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109461145B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS54115808A (en) * | 1978-02-25 | 1979-09-08 | Japanese National Railways<Jnr> | Train centralized control board and method of controlling display thereof |
CN102893596A (en) * | 2010-08-06 | 2013-01-23 | 比兹摩德莱恩有限公司 | Apparatus and method for augmented reality |
CN103455978A (en) * | 2012-05-31 | 2013-12-18 | 索尼电脑娱乐欧洲有限公司 | Apparatus and method for augmenting a video image |
CN104160426A (en) * | 2012-02-22 | 2014-11-19 | 株式会社微网 | Augmented reality image processing device and method |
Also Published As
Publication number | Publication date |
---|---|
CN109461145B (en) | 2021-03-23 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |