CN106384118A - Method and device for determining degree of dispersion in virtual reality - Google Patents
- Publication number
- CN106384118A (application CN201610939887.XA)
- Authority
- CN
- China
- Prior art keywords
- dispersion
- degree
- view data
- value
- rank
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/758—Involving statistics of pixels or of feature values, e.g. histogram matching
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Image Generation (AREA)
Abstract
Embodiments of the invention provide a method and a device for determining the degree of dispersion in virtual reality. The method comprises the following steps: performing dispersion processing on received image data to obtain dispersion-processed image data; parsing the dispersion-processed image data to obtain a dispersion value of the image data; and determining the dispersion degree level of the image data according to the dispersion value. Through this method, the degree of dispersion can be determined quantitatively in virtual reality, and the accuracy of the determination is improved.
Description
Technical field
The invention relates to the technical field of virtual reality, and more particularly to a method and a device for determining the degree of dispersion in virtual reality.
Background technology
Virtual reality technology combines computer graphics, computer simulation, sensor technology, display technology and other sciences. It creates a virtual information environment in a multi-dimensional information space, gives the user an immersive sense of presence, offers complete interaction with the environment, and helps inspire design.

Owing to these advantages, virtual reality technology has improved the user experience of existing audio and video equipment and has spread into wider fields such as video conferencing, network technology and distributed computing, developing toward distributed virtual reality. It has become an important means of designing and developing new products.
In the course of the invention, the inventor found at least the following problems in the prior art. The vast majority of today's virtual reality helmets exhibit RGB colour fringing, i.e. the dispersion phenomenon, and this dispersion always occurs in the edge region of the lens: the lenses used in virtual reality are made of high-refractive-index material, and white light is split into several monochromatic lights after passing through such a lens. At present the standard for determining the degree of dispersion relies mainly on visual inspection, judging the severity of the dispersion intuitively by eye; but when two degrees of dispersion are close, visual inspection cannot determine the degree accurately.

Therefore, how to determine the degree of dispersion quantitatively in virtual reality has become a technical problem urgently needing to be solved in the prior art.
Summary of the invention
In view of this, one of the technical problems solved by the present application is to provide a method and a device for determining the degree of dispersion in virtual reality, so as to overcome the inaccurate determination of the degree of dispersion in the prior art.
One embodiment of the present application provides a method for determining the degree of dispersion in virtual reality, comprising:

performing dispersion processing on received image data to obtain dispersion-processed image data;

parsing the dispersion-processed image data to obtain a dispersion value of the image data;

determining a dispersion degree level of the image data according to the dispersion value.
Optionally, in one specific embodiment of the present application, performing dispersion processing on the received image data to obtain the dispersion-processed image data comprises:

performing an inverse operation on the light forming the image data to separate white light into monochromatic lights;

synthesizing white light from the monochromatic lights to form the dispersion-processed image data.
Optionally, in one specific embodiment of the present application, parsing the dispersion-processed image data to obtain the dispersion value of the image data comprises:

obtaining boundary image data in the dispersion-processed image data;

taking the dispersion value of the boundary image data as the dispersion value of the dispersion-processed image data.
Optionally, in one specific embodiment of the present application, determining the dispersion degree level of the image data according to the dispersion value comprises:

comparing the dispersion value with the dispersion value range corresponding to each preset dispersion degree level to obtain the dispersion value range into which the dispersion value falls;

taking the dispersion degree level corresponding to the dispersion value range into which the dispersion value falls as the dispersion degree level of the image data.
Optionally, in one specific embodiment of the present application, the dispersion value is the ratio of the degree of dispersion of the dispersion-processed image data to the degree of dispersion of an image in which no dispersion has occurred.
An embodiment of the present application provides a device for determining the degree of dispersion in virtual reality, comprising:

a dispersion processing module, configured to perform dispersion processing on received image data and obtain dispersion-processed image data;

a value obtaining module, configured to parse the dispersion-processed image data and obtain a dispersion value of the image data;

a level determination module, configured to determine a dispersion degree level of the image data according to the dispersion value.
Optionally, in one specific embodiment of the present application, the dispersion processing module comprises:

a white light separation unit, configured to perform an inverse operation on the light forming the image data and separate white light into monochromatic lights;

a white light synthesis unit, configured to synthesize white light from the monochromatic lights and form the dispersion-processed image data.
Optionally, in one specific embodiment of the present application, the value obtaining module comprises:

a boundary obtaining unit, configured to obtain boundary image data in the dispersion-processed image data;

a boundary substituting unit, configured to take the dispersion value of the boundary image data as the dispersion value of the dispersion-processed image data.
Optionally, in one specific embodiment of the present application, the level determination module comprises:

a value comparing unit, configured to compare the dispersion value with the dispersion value range corresponding to each preset dispersion degree level and obtain the dispersion value range into which the dispersion value falls;

a level corresponding unit, configured to take the dispersion degree level corresponding to the dispersion value range into which the dispersion value falls as the dispersion degree level of the image data.
Optionally, in one specific embodiment of the present application, the dispersion value is the ratio of the degree of dispersion of the dispersion-processed image data to the degree of dispersion of an image in which no dispersion has occurred.
As can be seen from the above technical solutions, in the embodiments of the present application, received image data undergoes dispersion processing to obtain dispersion-processed image data; the dispersion-processed image data is parsed to obtain a dispersion value of the image data; and the dispersion degree level of the image data is determined according to the dispersion value. Through the above method, the degree of dispersion is determined quantitatively in virtual reality, improving the accuracy of the determination.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those of ordinary skill in the art can derive other drawings from them.
Fig. 1 is a flow chart of one embodiment of a method for determining the degree of dispersion in virtual reality according to the present application;

Fig. 2 is a flow chart of another embodiment of the method for determining the degree of dispersion in virtual reality according to the present application;

Fig. 3 is a flow chart of another embodiment of the method for determining the degree of dispersion in virtual reality according to the present application;

Fig. 4 is a flow chart of another embodiment of the method for determining the degree of dispersion in virtual reality according to the present application;

Fig. 5 is a structural diagram of one embodiment of a device for determining the degree of dispersion in virtual reality according to the present application;

Fig. 6 is a structural diagram of the dispersion processing module of the device for determining the degree of dispersion in virtual reality according to the present application;

Fig. 7 is a structural diagram of the value obtaining module of the device for determining the degree of dispersion in virtual reality according to the present application;

Fig. 8 is a structural diagram of the level determination module of the device for determining the degree of dispersion in virtual reality according to the present application;

Fig. 9 is a hardware structure diagram of an electronic apparatus of the present application that executes the method for determining the degree of dispersion in virtual reality.
Detailed description of the embodiments
The embodiments of the present application perform dispersion processing on received image data to obtain dispersion-processed image data; parse the dispersion-processed image data to obtain a dispersion value of the image data; and determine the dispersion degree level of the image data according to the dispersion value. The embodiments of the present application thus determine the degree of dispersion quantitatively in virtual reality and improve the accuracy of the determination.

Of course, implementing any technical solution of the embodiments of the present application does not necessarily achieve all of the above advantages at the same time.
To help those skilled in the art better understand the technical solutions in the embodiments of the present application, the technical solutions are described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application shall fall within the protection scope of the present application.

The implementation of the embodiments of the present application is further illustrated below with reference to the accompanying drawings.
Embodiment one
As shown in Fig. 1, the present application provides a method for determining the degree of dispersion in virtual reality, comprising:

S1, performing dispersion processing on received image data and obtaining dispersion-processed image data.

In this embodiment, the dispersion processing may specifically be passing the light that forms the image data through a lens of the virtual reality device. When the light passes through the lens, according to the refraction principle, a ray travelling obliquely from one transparent medium into another is refracted; because the monochromatic components differ in wavelength and refractive index, their refraction angles differ and their propagation directions change, which forms the dispersion.
In this embodiment, the dispersion-processed image data may be captured with imaging equipment, for example by using a still or video camera to shoot the image formed after the light passes through the lens.
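As a minimal numeric illustration of the refraction principle described above, the following sketch applies Snell's law to light entering a lens surface at 45°. The refractive indices are assumed sample values for a dense flint glass at three standard wavelengths, not figures from the application; they show only that a higher index (blue) yields a smaller refraction angle than a lower one (red), which is the origin of the colour fringing.

```python
import math

def refraction_angle(n, incidence_deg):
    """Snell's law for a ray passing from air (n = 1.0) into glass of index n."""
    theta1 = math.radians(incidence_deg)
    return math.degrees(math.asin(math.sin(theta1) / n))

# Assumed sample indices for a dense flint glass at three wavelengths.
indices = {"red (656 nm)": 1.644, "green (589 nm)": 1.650, "blue (486 nm)": 1.664}

for colour, n in indices.items():
    # Blue, with the highest index, is bent the most (smallest angle).
    print(f"{colour}: refracted at {refraction_angle(n, 45.0):.2f} degrees")
```

The per-wavelength spread of roughly a third of a degree is what separates white light into its monochromatic components at the lens edge, where rays strike obliquely.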
S2, parsing the dispersion-processed image data and obtaining the dispersion value of the image data.

In this embodiment, parsing the dispersion-processed image data specifically means obtaining the colour value (RGB) corresponding to each pixel in the image data.
In this embodiment, the number of pixels whose colour value has changed can be counted from the colour values obtained by the analysis, so as to obtain the dispersion value of the dispersion-processed image data. For example, the colour value of each pixel in the image data is obtained by traversal and compared with the colour value of the corresponding pixel in the original image; if the two are identical, it is considered that no dispersion has occurred at the current pixel, otherwise dispersion is considered to have occurred. The number of dispersed pixels is counted, and the dispersion value of the dispersion-processed image data is obtained from it.
In this embodiment, the dispersion value may be the ratio of the degree of dispersion of the dispersion-processed image data to the degree of dispersion of an image in which no dispersion has occurred.
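The traverse-and-compare procedure above can be sketched as follows. The function name and the tiny four-pixel images are illustrative only; the ratio computed is the changed-to-unchanged form of the dispersion value suggested in this embodiment, with images represented as flat lists of (R, G, B) tuples.

```python
def dispersion_value(original, dispersed):
    """Ratio of pixels whose RGB colour changed after dispersion to pixels
    that did not change. Both images are equal-length lists of (R, G, B)."""
    changed = sum(1 for a, b in zip(original, dispersed) if a != b)
    unchanged = len(original) - changed
    return changed / unchanged if unchanged else float("inf")

# A four-pixel white image; dispersion shifted the colour of one edge pixel.
original = [(255, 255, 255)] * 4
dispersed = [(255, 255, 255), (255, 255, 255), (255, 255, 255), (250, 255, 255)]
print(dispersion_value(original, dispersed))  # 1 changed / 3 unchanged
```

In practice the comparison could also be made tolerant of small sensor noise by comparing channels within a threshold rather than for exact equality.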
S3, determining the dispersion degree level of the image data according to the dispersion value.

In this embodiment, there is a certain correspondence between dispersion degree levels and dispersion value ranges, and this correspondence may be preset.

In this embodiment, the dispersion degree level corresponding to the dispersion value range into which the dispersion value falls may be taken as the dispersion degree level of the dispersion-processed image data.
In this embodiment, dispersion processing is performed on the image data, the dispersion-processed image data is parsed to obtain the dispersion value of the image data, and the dispersion degree level of the image data is obtained according to the dispersion value, so that the degree of dispersion is determined quantitatively in virtual reality.
Embodiment two
As shown in Fig. 2, step S1 comprises:

S11, performing an inverse operation on the light forming the image data to separate white light into monochromatic lights.

In this embodiment, the inverse operation performed on the light is the inverse of the dispersion processing.

In this embodiment, the light forming the image data is pre-processed with the inverse operation before it passes through the lens of the virtual reality device, so that after the pre-processed image data undergoes dispersion processing, the degree of dispersion occurring in the image is reduced.
In this embodiment, the inverse operation on the light forming the image data can be carried out by software or by optical means. With the software approach, reverse chromatic compensation is applied to the image data: for example, a white pixel of the image is converted by software calculation into several monochromatic pixels, so that when the monochromatic light of these pixels passes through the lens, white light is recombined. With an optical instrument, the inverse operation separates white light into monochromatic lights: for example, since a convex lens is used during the dispersion processing, a concave lens is adopted for the inverse operation, and the light forming the image data is passed through the concave lens so that white light is separated into monochromatic lights.
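The software approach to reverse chromatic compensation might look like the following one-dimensional sketch, in which the red and blue channels of each pixel row are displaced in opposite directions so that a lens shifting them back would recombine them into white. The `shift` amount, the sampling direction and all names are assumptions for illustration, not the application's implementation.

```python
def precompensate_row(row, shift=1):
    """Reverse chromatic compensation, 1-D sketch: sample the red channel
    from `shift` pixels to the right and the blue channel from `shift`
    pixels to the left, leaving green in place, so that dispersion which
    displaces red and blue in opposite directions restores alignment.
    `row` is a list of (R, G, B) tuples; edges are clamped."""
    n = len(row)
    out = []
    for i in range(n):
        r = row[min(n - 1, i + shift)][0]  # red sampled from the right
        g = row[i][1]                      # green left in place
        b = row[max(0, i - shift)][2]      # blue sampled from the left
        out.append((r, g, b))
    return out

row = [(10, 20, 30), (40, 50, 60), (70, 80, 90)]
print(precompensate_row(row))  # [(40, 20, 30), (70, 50, 30), (70, 80, 60)]
```

A uniform white row is left unchanged by this pre-compensation, which matches the intent: only where colours vary spatially does the channel displacement have an effect.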
S12, synthesizing white light from the monochromatic lights to form the dispersion-processed image data.

In this embodiment, the dispersion processing specifically passes the pre-processed monochromatic light through the lens. According to the refraction principle above, monochromatic rays of different refractive indices have different refraction angles; when the pre-processed monochromatic light passes through the lens, the different refraction angles allow the monochromatic lights to reunite into white light, and the dispersion-processed image data is obtained.

In this embodiment, steps S11 and S12 may also form one integral step. For example, an apochromatic objective may be used to obtain the dispersion-processed image data. Such an objective, made of special glass, fluorite or similar materials, can correct the aberration of red, green and blue light simultaneously: when the light forming the image data passes through it, white light is separated into monochromatic lights and the monochromatic lights are recombined into white light, so that the degree of dispersion that occurs is reduced. This technique is highly mature and is not described further here.
In this embodiment, performing the inverse operation on the light forming the image data reduces the degree of dispersion occurring in the image when the image data undergoes dispersion processing, improving the accuracy of determining the dispersion degree level of the image data.
Embodiment three
As shown in Fig. 3, step S2 comprises:

S21, obtaining the boundary image data in the dispersion-processed image data.

In the course of realizing the invention, the inventor found through a large number of experiments that the regions where dispersion occurs are mainly concentrated in the boundary region of the image, while essentially no dispersion occurs in its central region. The reason for this phenomenon is that the light corresponding to the central region of the image passes through the lens at a right angle; that light is not refracted, so no dispersion occurs there.
In this embodiment, the image data of the image boundary after dispersion processing is obtained. For example, a 200 × 200 px image whose RGB value is (255, 255, 255), i.e. a white image, is used, and the size of the image is chosen close to the size of the lens in the virtual reality helmet so that the determined degree of dispersion is more accurate. After the original image has been pre-processed and dispersion-processed, the image passing through the lens is shot with a camera and its pixels are passed to the parsing module. The parsing module expands the original image outward by 10 px and shrinks it inward by 10 px to determine the boundary range of the dispersion-processed image, then parses the pixels within the boundary range and obtains the corresponding RGB values.
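Selecting the 10 px boundary band described above can be sketched as follows. The function and its parameters are illustrative; it simply enumerates every pixel coordinate within a fixed band of the image edge, which is the region the parsing module examines while the dispersion-free centre is skipped.

```python
def boundary_pixels(width, height, band=10):
    """Coordinates of the boundary band of an image: every pixel lying
    within `band` px of any edge. Only these pixels need parsing, since
    dispersion concentrates at the image boundary."""
    coords = []
    for y in range(height):
        for x in range(width):
            if x < band or y < band or x >= width - band or y >= height - band:
                coords.append((x, y))
    return coords

# For the 200 x 200 px example image with a 10 px band:
pixels = boundary_pixels(200, 200)
print(len(pixels))  # 200*200 - 180*180 = 7600
```

For the 200 × 200 example this leaves 7,600 of the 40,000 pixels to parse, which is where the reduction in processor load claimed below comes from.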
S22, taking the dispersion value of the boundary image data as the dispersion value of the dispersion-processed image data.

From the RGB value of each pixel of the boundary image obtained by the analysis, it is judged whether dispersion has occurred at that point, and the dispersion value of the dispersion-processed image data is obtained. For example, the RGB value of each pixel is obtained by traversal and contrasted with (255, 255, 255); if the two are identical, it is considered that no dispersion has occurred at the current pixel, otherwise dispersion is considered to have occurred. The number of dispersed pixels is then counted.
In this embodiment, only the boundary image data of the image data is processed, which reduces the computational load of the processor while ensuring that the dispersion data is processed accurately.
Example IV
As shown in Fig. 4, step S3 comprises:

S31, comparing the dispersion value with the dispersion value range corresponding to each preset dispersion degree level to obtain the dispersion value range into which the dispersion value falls.

In this embodiment, a criterion for the dispersion level is provided. Specifically, the dispersion value is the ratio of the number of dispersed pixels to the number of non-dispersed pixels, or the ratio of the number of dispersed pixels to the number of pixels in the original image.
In this embodiment, the dispersion value ranges may be determined according to several preset threshold values. For example, a first value range may be determined by a first threshold and a second threshold, a second value range by the second threshold and a third threshold, and so on. The number of value ranges and the choice of thresholds can be configured according to actual needs.

In this embodiment, the dispersion value is compared with the preset thresholds, so as to determine the dispersion value range into which the dispersion value falls.
S32, taking the dispersion degree level corresponding to the dispersion value range into which the dispersion value falls as the dispersion degree level of the image data.

Specifically, the dispersion degree level can be determined according to the proportion of dispersed pixels. For example, if dispersion occurs in 0-10% of the pixels, the degree of dispersion is level one; if in 10-20% of the pixels, level two; if in 20-30% of the pixels, level three; and so on. According to such a criterion, the dispersion degree level is obtained.
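The example criterion above (0-10% → level one, 10-20% → level two, and so on) can be sketched as a simple threshold lookup. The function name and the fixed 10% step are assumptions for illustration; the embodiment allows the number of ranges and the thresholds to be configured as needed.

```python
def dispersion_level(ratio, step=0.10):
    """Map the fraction of dispersed pixels to a dispersion degree level:
    0-10% -> level 1, 10-20% -> level 2, etc. The uniform 10% step is the
    example criterion from the text; arbitrary thresholds would also work."""
    if ratio < 0:
        raise ValueError("ratio must be non-negative")
    return int(ratio // step) + 1

print(dispersion_level(0.05))  # level 1
print(dispersion_level(0.15))  # level 2
print(dispersion_level(0.25))  # level 3
```

For non-uniform ranges, the same lookup could be written as a scan over a sorted list of (threshold, level) pairs instead of a fixed division.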
Embodiment five
As shown in Fig. 5, the present application provides a device for determining the degree of dispersion in virtual reality, comprising:

a dispersion processing module 40, configured to perform dispersion processing on received image data and obtain dispersion-processed image data.
In this embodiment, the dispersion processing may specifically be passing the light that forms the image data through a lens of the virtual reality device. When the light passes through the lens, according to the refraction principle, a ray travelling obliquely from one transparent medium into another is refracted; because the monochromatic components differ in wavelength and refractive index, their refraction angles differ and their propagation directions change, which forms the dispersion.

In this embodiment, the dispersion-processed image data may be captured with imaging equipment, for example by using a still or video camera to shoot the image formed after the light passes through the lens.
a value obtaining module 50, configured to parse the dispersion-processed image data and obtain the dispersion value of the image data.

In this embodiment, parsing the dispersion-processed image data specifically means obtaining the colour value (RGB) corresponding to each pixel in the image data.

In this embodiment, the number of pixels whose colour value has changed can be counted from the colour values obtained by the analysis, so as to obtain the dispersion value of the dispersion-processed image data. For example, the colour value of each pixel in the image data is obtained by traversal and compared with the colour value of the corresponding pixel in the original image; if the two are identical, it is considered that no dispersion has occurred at the current pixel, otherwise dispersion is considered to have occurred. The number of dispersed pixels is counted, and the dispersion value of the dispersion-processed image data is obtained from it.

In this embodiment, the dispersion value may be the ratio of the degree of dispersion of the dispersion-processed image data to the degree of dispersion of an image in which no dispersion has occurred.
a level determination module 60, configured to determine the dispersion degree level of the image data according to the dispersion value.

In this embodiment, there is a certain correspondence between dispersion degree levels and dispersion value ranges, and this correspondence may be preset.

In this embodiment, the dispersion degree level corresponding to the dispersion value range into which the dispersion value falls may be taken as the dispersion degree level of the dispersion-processed image data.

In this embodiment, dispersion processing is performed on the image data, the dispersion-processed image data is parsed to obtain the dispersion value of the image data, and the dispersion degree level of the image data is obtained according to the dispersion value, so that the degree of dispersion is determined quantitatively in virtual reality.
Embodiment six
As shown in Fig. 6, the dispersion processing module 40 comprises:

a white light separation unit 41, configured to perform an inverse operation on the light forming the image data and separate white light into monochromatic lights.

In this embodiment, the inverse operation performed on the light is the inverse of the dispersion processing.

In this embodiment, the light forming the image data is pre-processed with the inverse operation before it passes through the lens of the virtual reality device, so that after the pre-processed image data undergoes dispersion processing, the degree of dispersion occurring in the image is reduced.
In this embodiment, the inverse operation on the light forming the image data can be carried out by software or by optical means. With the software approach, reverse chromatic compensation is applied to the image data: for example, a white pixel of the image is converted by software calculation into several monochromatic pixels, so that when the monochromatic light of these pixels passes through the lens, white light is recombined. With an optical instrument, the inverse operation separates white light into monochromatic lights: for example, since a convex lens is used during the dispersion processing, a concave lens is adopted for the inverse operation, and the light forming the image data is passed through the concave lens so that white light is separated into monochromatic lights.
a white light synthesis unit 42, configured to synthesize white light from the monochromatic lights and form the dispersion-processed image data.

In this embodiment, the dispersion processing specifically passes the pre-processed monochromatic light through the lens. According to the refraction principle above, monochromatic rays of different refractive indices have different refraction angles; when the pre-processed monochromatic light passes through the lens, the different refraction angles allow the monochromatic lights to reunite into white light, and the dispersion-processed image data is obtained.

In this embodiment, the white light separation unit 41 and the white light synthesis unit 42 may also form one integral unit. For example, an apochromatic objective may be used to obtain the dispersion-processed image data. Such an objective, made of special glass, fluorite or similar materials, can correct the aberration of red, green and blue light simultaneously: when the light forming the image data passes through it, white light is separated into monochromatic lights and the monochromatic lights are recombined into white light, so that the degree of dispersion that occurs is reduced. This technique is highly mature and is not described further here.
In this embodiment, performing the inverse operation on the light forming the image data reduces the degree of dispersion occurring in the image when the image data undergoes dispersion processing, improving the accuracy of determining the dispersion degree level of the image data.
Embodiment seven
As shown in Fig. 7, the value obtaining module 50 comprises:

a boundary obtaining unit 51, configured to obtain the boundary image data in the dispersion-processed image data.

In the course of realizing the invention, the inventor found through a large number of experiments that the regions where dispersion occurs are mainly concentrated in the boundary region of the image, while essentially no dispersion occurs in its central region. The reason for this phenomenon is that the light corresponding to the central region of the image passes through the lens at a right angle; that light is not refracted, so no dispersion occurs there.
In the present embodiment, the image data of the image boundary after dispersion processing is obtained. For example, a 200 × 200 px image whose RGB value is (255, 255, 255), i.e., a white image, is used; the size of the image is selected to be close to the size of the lens in the virtual reality helmet, so that the determined degree of dispersion is more accurate. After the pre-processed original image undergoes dispersion processing, a camera is used to capture the image through the lens, and the pixels are passed to the parsing module. The parsing module expands the original image outward by 10 px and shrinks it inward by 10 px to determine the boundary range of the image after dispersion processing. The parsing module then parses the pixels within the boundary range and obtains the corresponding RGB values.
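The boundary-range selection above can be sketched as follows. This is a hypothetical helper, not code from the patent; for simplicity it takes only the inward 10 px ring within a single image buffer, whereas the patent's bounds also extend 10 px outward to catch fringes that spill past the original edge:

```python
# Sketch of the boundary-region selection: keep only pixels within
# `margin` px of any edge of a width x height image, forming a ring
# around the border where dispersion fringes concentrate.

def boundary_ring_pixels(width, height, margin=10):
    """Return (x, y) coordinates of the boundary ring of the image."""
    ring = []
    for y in range(height):
        for x in range(width):
            near_edge = (
                x < margin or x >= width - margin or
                y < margin or y >= height - margin
            )
            if near_edge:
                ring.append((x, y))
    return ring

# A 200 x 200 image with a 10 px ring leaves a 180 x 180 interior,
# so the ring holds 200*200 - 180*180 = 7600 pixels.
ring = boundary_ring_pixels(200, 200, margin=10)
print(len(ring))  # 7600
```

Parsing only these 7600 pixels instead of all 40000 is what gives embodiment seven its reduction in processor load.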
A boundary substituting unit 52, configured to take the dispersion value of the boundary image data as the dispersion value of the image data after the dispersion processing.
Based on the RGB value of each pixel of the boundary image obtained by parsing, it is judged whether dispersion occurs at that point, thereby obtaining the dispersion value of the image data after the dispersion processing. For example, the RGB value of each pixel is obtained by traversal and compared with (255, 255, 255); if they are identical, it is considered that no dispersion occurs at the current pixel; otherwise, it is considered that dispersion occurs at the current pixel. The number of pixels where dispersion occurs is obtained by counting.
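The traversal-and-compare step above amounts to counting pixels that deviate from the pure-white reference. A minimal sketch, with illustrative sample data (the pixel values below are not from the patent):

```python
# Count dispersion pixels: any pixel whose RGB differs from the
# reference colour (255, 255, 255) is treated as showing dispersion.

WHITE = (255, 255, 255)

def count_dispersion_pixels(rgb_pixels, reference=WHITE):
    """Count pixels whose RGB value differs from the reference colour."""
    return sum(1 for p in rgb_pixels if tuple(p) != reference)

# Illustrative boundary pixels: two colour-fringed, two clean white.
boundary = [(255, 255, 255), (255, 250, 240), (250, 255, 255), (255, 255, 255)]
print(count_dispersion_pixels(boundary))  # 2
```

In practice a small tolerance around the reference colour might be preferable to exact equality, to avoid counting sensor noise as dispersion; the patent text itself specifies the exact comparison.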
The present embodiment processes only the boundary image data of the image data, which reduces the computational load on the processor while ensuring that the dispersion data is processed accurately.
Embodiment eight
As shown in Fig. 8, the level determination module 60 includes:
A numerical value comparing unit 61, configured to compare the dispersion value with the dispersion value ranges corresponding to the preset dispersion-degree levels, and obtain the dispersion value range into which the dispersion value falls.
In the present embodiment, a criterion for the dispersion level is provided. Specifically, the dispersion value is the ratio of the number of pixels where dispersion occurs to the number of pixels where no dispersion occurs, or the ratio of the number of pixels where dispersion occurs to the number of pixels in the original image.
In the present embodiment, the dispersion value ranges may be determined according to several preset thresholds; for example, a first threshold and a second threshold may determine a first value range, the second threshold and a third threshold may determine a second value range, and so on. The number of value ranges and the choice of thresholds can be configured according to actual needs.
In the present embodiment, the dispersion value may be compared with the preset thresholds, so as to determine the dispersion value range into which the dispersion value falls.
A level corresponding unit 62, configured to take the dispersion-degree level corresponding to the dispersion value range into which the dispersion value falls as the dispersion-degree level of the image data.
Specifically, the dispersion-degree level can be determined according to the proportion of pixels where dispersion occurs. For example, if dispersion occurs in 0~10% of the pixels, the dispersion degree is level one; if dispersion occurs in 10~20% of the pixels, the dispersion degree is level two; if dispersion occurs in 20~30% of the pixels, the dispersion degree is level three; and so on. According to such a criterion, the dispersion-degree level is obtained.
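The threshold-bucketing rule above can be sketched as follows. The threshold values are the illustrative ones from the text (10%, 20%, 30%); the patent leaves the actual number and placement of thresholds configurable:

```python
# Map the dispersion-pixel ratio to a 1-based dispersion-degree level
# using preset thresholds: 0-10% -> level 1, 10-20% -> level 2, and so on.

THRESHOLDS = [0.10, 0.20, 0.30]  # upper bounds of levels 1, 2, 3

def dispersion_level(dispersion_pixels, total_pixels, thresholds=THRESHOLDS):
    """Return the dispersion-degree level for the given pixel counts."""
    ratio = dispersion_pixels / total_pixels
    for level, upper in enumerate(thresholds, start=1):
        if ratio <= upper:
            return level
    return len(thresholds) + 1  # ratio above the last threshold

# With the 7600-pixel boundary ring of a 200 x 200 image:
print(dispersion_level(760, 7600))   # exactly 10% -> level 1
print(dispersion_level(1200, 7600))  # about 15.8% -> level 2
```

Here the ratio is dispersion pixels over total pixels, the second of the two ratio definitions given above; the first definition (dispersion pixels over non-dispersion pixels) would only change the threshold values, not the bucketing logic.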
Fig. 9 is a schematic hardware structure diagram of an electronic device for executing the method of determining the degree of dispersion in virtual reality according to the present application.
As shown in Fig. 9, the device includes:
One or more processors 91 and a memory 92; one processor 91 is taken as the example in Fig. 9.
The device for executing the method of determining the degree of dispersion in virtual reality may further include: an input means 93 and an output means 94.
The processor 91, the memory 92, the input means 93, and the output means 94 may be connected by a bus or in other manners; connection by a bus is taken as the example in Fig. 9.
The memory 92, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the method of determining the degree of dispersion in virtual reality in the embodiments of the present application. By running the non-volatile software programs, instructions, and modules stored in the memory 92, the processor 91 executes the various functional applications and data processing of the server, that is, implements the method of determining the degree of dispersion in virtual reality in the above method embodiments.
The memory 92 may include a program storage area and a data storage area, wherein the program storage area may store the operating system and the application programs required for at least one function, and the data storage area may store data created according to the use of the device for determining the degree of dispersion in virtual reality, and the like. In addition, the memory 92 may include a high-speed random access memory, and may further include a non-volatile memory, for example at least one magnetic disk memory, a flash memory device, or other non-volatile solid-state memory. In some embodiments, the memory 92 optionally includes memories remotely located with respect to the processor 91, and these remote memories may be connected via a network to the device for determining the degree of dispersion in virtual reality. Examples of such a network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The input means 93 can receive input numeric or character information, and generate key signal inputs related to the user settings and function control of the device for determining the degree of dispersion in virtual reality. The input means 93 may include devices such as a pressing module.
The one or more modules are stored in the memory 92 and, when executed by the one or more processors 91, perform the method of determining the degree of dispersion in virtual reality in any of the above method embodiments.
The above product can execute the method provided in the embodiments of the present application, and possesses the corresponding functional modules for executing the method together with the corresponding beneficial effects. For technical details not described in detail in this embodiment, reference can be made to the method provided in the embodiments of the present application.
The electronic device of the embodiments of the present application exists in a variety of forms, including but not limited to:
(1) Mobile communication devices: such devices are characterized by mobile communication functions, with voice and data communication as the main targets. This type of terminal includes: smart phones (e.g., iPhone), multimedia phones, feature phones, and low-end phones.
(2) Ultra-mobile personal computer devices: such devices belong to the category of personal computers, have computing and processing functions, and generally also possess mobile Internet access characteristics. This type of terminal includes: PDA, MID, and UMPC devices, such as the iPad.
(3) Portable entertainment devices: such devices can display and play multimedia content. They include: audio and video players (e.g., iPod), handheld game consoles, e-book readers, as well as smart toys and portable in-vehicle navigation devices.
(4) Servers: devices providing computing services. A server is composed of a processor, a hard disk, memory, a system bus, and so on; its architecture is similar to that of a general-purpose computer, but because highly reliable services must be provided, the requirements on processing capability, stability, reliability, security, scalability, manageability, and the like are higher.
(5) Other electronic apparatuses with data interaction functions.
The device embodiments described above are merely illustrative; the modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules, i.e., they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without creative work.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly also by hardware. Based on such an understanding, the above technical solutions, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, where the computer-readable storage medium includes any mechanism that stores or transmits information in a form readable by a computer (e.g., a machine). For example, machine-readable media include read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash media, and electrical, optical, acoustic, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). The computer software product includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in each embodiment or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the embodiments of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or equivalently replace some of the technical features therein, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. determine the method for degree of dispersion in a kind of virtual reality it is characterised in that including:
The view data receiving is carried out with dispersion process, and obtains the view data after dispersion is processed;
Parse the view data after described dispersion is processed, obtain the dispersion value of described image data;
According to described dispersion value, determine the degree of dispersion rank of described image data.
2. in virtual reality as claimed in claim 1 determine degree of dispersion method it is characterised in that described to receive figure
As data carries out dispersion process, and the view data obtaining after dispersion process includes:
Inverse operation is carried out to the light forming described image data, white light is separated into monochromatic light;
The white light being synthesized using described monochromatic light, forms the view data after described dispersion is processed.
3. in virtual reality as claimed in claim 1, determine the method for degree of dispersion it is characterised in that the described color of described parsing
Dissipate the view data after processing, the dispersion value obtaining described image data includes:
Obtain the boundary image data in the view data after described dispersion is processed;
The dispersion value of the view data after the dispersion value of described boundary image data is processed as described dispersion.
4. in virtual reality as claimed in claim 1 determine degree of dispersion method it is characterised in that described according to described color
Scattered numerical value, determines that the degree of dispersion rank of described image data includes:
Corresponding with default each degree of dispersion rank for described dispersion value dispersion value scope is compared, obtains described color
The dispersion value scope that scattered numerical value is fallen into;
The dispersion value scope corresponding degree of dispersion rank that described dispersion value is fallen into is as the color of described image data
Scattered degree rank.
5. in virtual reality as claimed in claim 1, determine the method for degree of dispersion it is characterised in that described dispersion value is
Described dispersion process after the degree of dispersion of view data and the degree of dispersion of image that dispersion does not occur ratio.
6. A device for determining a degree of dispersion in virtual reality, characterized by comprising:
a dispersion processing module, configured to perform dispersion processing on received image data and obtain image data after the dispersion processing;
a numerical value obtaining module, configured to parse the image data after the dispersion processing and obtain a dispersion value of the image data;
a level determination module, configured to determine a dispersion-degree level of the image data according to the dispersion value.
7. The device for determining a degree of dispersion in virtual reality according to claim 6, characterized in that the dispersion processing module comprises:
a white-light separation unit, configured to perform an inverse operation on light rays forming the image data and separate white light into monochromatic light;
a white-light synthesis unit, configured to form the image data after the dispersion processing by using white light synthesized from the monochromatic light.
8. The device for determining a degree of dispersion in virtual reality according to claim 6, characterized in that the numerical value obtaining module comprises:
a boundary obtaining unit, configured to obtain boundary image data in the image data after the dispersion processing;
a boundary substituting unit, configured to take a dispersion value of the boundary image data as the dispersion value of the image data after the dispersion processing.
9. The device for determining a degree of dispersion in virtual reality according to claim 6, characterized in that the level determination module comprises:
a numerical value comparing unit, configured to compare the dispersion value with dispersion value ranges corresponding to preset dispersion-degree levels and obtain the dispersion value range into which the dispersion value falls;
a level corresponding unit, configured to take the dispersion-degree level corresponding to the dispersion value range into which the dispersion value falls as the dispersion-degree level of the image data.
10. The device for determining a degree of dispersion in virtual reality according to claim 6, characterized in that the dispersion value is the ratio of the degree of dispersion of the image data after the dispersion processing to the degree of dispersion of an image in which no dispersion occurs.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610939887.XA CN106384118A (en) | 2016-10-24 | 2016-10-24 | Method and device for determining degree of dispersion in virtual reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106384118A true CN106384118A (en) | 2017-02-08 |
Family
ID=57958141
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610939887.XA Pending CN106384118A (en) | 2016-10-24 | 2016-10-24 | Method and device for determining degree of dispersion in virtual reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106384118A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108426702A (en) * | 2018-01-19 | 2018-08-21 | 华勤通讯技术有限公司 | The dispersion measurement device and method of augmented reality equipment |
CN112862930A (en) * | 2021-03-15 | 2021-05-28 | 网易(杭州)网络有限公司 | Game scene processing method and device and electronic equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105785487A (en) * | 2016-05-11 | 2016-07-20 | 东莞市深蓝光电科技有限公司 | Biconvex lens for VR glasses |
CN105809644A (en) * | 2016-03-15 | 2016-07-27 | 深圳英飞拓科技股份有限公司 | Image edge false color inhabitation method and apparatus |
CN105979252A (en) * | 2015-12-03 | 2016-09-28 | 乐视致新电子科技(天津)有限公司 | Test method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20170208 |