CN106534835B - Image processing method and device - Google Patents
Image processing method and device
- Publication number
- CN106534835B CN106534835B CN201611082577.7A CN201611082577A CN106534835B CN 106534835 B CN106534835 B CN 106534835B CN 201611082577 A CN201611082577 A CN 201611082577A CN 106534835 B CN106534835 B CN 106534835B
- Authority
- CN
- China
- Prior art keywords
- color value
- light emitting source
- virtual reality device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/15—Processing image signals for colour aspects of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Processing (AREA)
Abstract
The embodiments of the invention disclose an image processing method and apparatus for reducing the power consumption of a virtual reality device. The method of the embodiments of the present invention includes: the virtual reality device obtains the color value of a light emitting source, the light emitting source being a light emitting source in a virtual scene; the virtual reality device determines an image of the virtual scene picture; the virtual reality device determines a target area from the image of the virtual scene picture, the target area being the region in the virtual scene that is illuminated by the light emitting source; the virtual reality device determines a target color value according to the color value of the light emitting source and the color value of the target area; and the virtual reality device replaces the color value of the target area with the target color value.
Description
Technical field
The present invention relates to image processing technologies, and in particular to an image processing method and apparatus.
Background technology
Virtual reality (VR) technology is a comprehensive, multidisciplinary technology involving computer graphics, human-computer interaction, sensing technology, artificial intelligence, and related fields. It uses a computer to generate lifelike visual, auditory, olfactory, and other sensory impressions, allowing the user, as a participant, to experience and interact with a virtual world naturally through appropriate devices. When the user moves, the computer immediately performs complex calculations and returns accurate 3D images to create a sense of presence. The technology integrates computer graphics (CG), computer simulation, artificial intelligence, sensing, display, and parallel network processing, and is a high-level simulation system aided by computer technology.
In the prior art, obtaining a lifelike virtual environment places high demands on computer graphics: the virtual environment needs to reproduce the scene effects of the real world, which includes simulating realistic lighting effects. For example, when watching a film in a real cinema, usually only the movie screen emits light, i.e., only the screen is a light emitting source, while everything else, such as the seats in the cinema, is an illuminated object. Under real illumination, the light emitted by a source is reflected many times among the illuminated objects, finally producing the lighting effect seen by the naked eye. Therefore, to simulate the real lighting effect of a cinema, the prior art typically uses a 3D graphics tool such as Unity3D and performs multi-bounce reflection calculations on the virtual scene image in the engine. However, such multi-bounce reflection calculations are computationally expensive and consume considerable power on a virtual reality device.
Summary of the invention
Embodiments of the present invention provide an image processing method and apparatus for reducing the power consumption of a virtual reality device.
In view of this, a first aspect of the embodiments of the present invention provides an image processing method applied to a virtual reality device. The method includes:
the virtual reality device obtains the color value of a light emitting source, the light emitting source being a light emitting source in a virtual scene;
the virtual reality device determines an image of the virtual scene picture;
the virtual reality device determines a target area from the image of the virtual scene picture, the target area being the region in the virtual scene that is illuminated by the light emitting source;
the virtual reality device determines a target color value according to the color value of the light emitting source and the color value of the target area;
the virtual reality device replaces the color value of the target area with the target color value.
In a possible implementation, obtaining the color value of the light emitting source includes:
the virtual reality device obtains the color value of each pixel in the region corresponding to the light emitting source;
the virtual reality device calculates the average of these color values and uses the average as the color value of the light emitting source.
In a possible implementation, obtaining the color value of the light emitting source includes:
the virtual reality device divides the region corresponding to the light emitting source into a light spot area and a halo area;
the virtual reality device obtains the color value of each pixel of the light spot area and the color value of each pixel of the halo area;
the virtual reality device calculates a first average from the color values of the pixels of the light spot area and a second average from the color values of the pixels of the halo area;
the virtual reality device weights the first average with a first weight to obtain a first color value and weights the second average with a second weight to obtain a second color value, the first weight being greater than the second weight;
the virtual reality device calculates the color value of the light emitting source according to the first color value and the second color value.
In a possible implementation, determining the target color value according to the color value of the light emitting source and the color value of the target area includes:
the virtual reality device determines a reflection coefficient of the target area, the reflection coefficient being positively correlated with the reflectance of the material of the illuminated object corresponding to the target area;
the virtual reality device calculates a tone weight of the target area according to the reflection coefficient, the scene distance between the light emitting source and the target area, and a scene transparency parameter;
the virtual reality device determines the target color value according to the color value of the light emitting source, the color value of the target area, and the tone weight of the target area.
In a possible implementation, calculating the tone weight of the target area according to the reflection coefficient, the scene distance between the light emitting source and the target area, and the scene transparency parameter includes:
the virtual reality device determines an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
the virtual reality device attenuates the light intensity parameter of the light emitting source according to the attenuation coefficient to obtain a temporary light intensity parameter;
the virtual reality device corrects the temporary light intensity parameter by the reflection coefficient to obtain a target light intensity parameter;
the virtual reality device calculates the tone weight of the target area according to the target light intensity parameter.
Based on the image processing method of the first aspect, a second aspect of the embodiments of the present invention provides an image processing apparatus applied to a virtual reality device. The apparatus includes:
an acquisition module, configured to obtain the color value of a light emitting source, the light emitting source being a light emitting source in a virtual scene;
a first determining module, configured to determine an image of the virtual scene picture;
a second determining module, configured to determine a target area from the image of the virtual scene picture determined by the first determining module, the target area being the region in the virtual scene that is illuminated by the light emitting source;
a third determining module, configured to determine a target color value according to the color value of the light emitting source obtained by the acquisition module and the color value of the target area determined by the second determining module;
a replacement module, configured to replace the color value of the target area determined by the second determining module with the target color value determined by the third determining module.
In a possible implementation, the acquisition module includes:
a first acquisition unit, configured to obtain the color value of each pixel in the region corresponding to the light emitting source;
a first computing unit, configured to calculate the average of the color values obtained by the first acquisition unit and use the average as the color value of the light emitting source.
In a possible implementation, the acquisition module includes:
a division unit, configured to divide the region corresponding to the light emitting source into a light spot area and a halo area;
a second acquisition unit, configured to obtain the color value of each pixel of the light spot area divided by the division unit and the color value of each pixel of the halo area divided by the division unit;
a second computing unit, configured to calculate a first average from the color values of the pixels of the light spot area obtained by the second acquisition unit and a second average from the color values of the pixels of the halo area obtained by the second acquisition unit.
The second computing unit is further configured to weight the first average with a first weight to obtain a first color value and to weight the second average with a second weight to obtain a second color value, the first weight being greater than the second weight.
The second computing unit is further configured to calculate the color value of the light emitting source according to the first color value and the second color value.
In a possible implementation, the third determining module includes:
a first determination unit, configured to determine the reflection coefficient of the target area, the reflection coefficient being positively correlated with the reflectance of the material of the illuminated object corresponding to the target area;
a third computing unit, configured to calculate the tone weight of the target area according to the reflection coefficient, the scene distance between the light emitting source and the target area, and the scene transparency parameter;
a second determination unit, configured to determine the target color value according to the color value of the light emitting source, the color value of the target area, and the tone weight of the target area.
In a possible implementation, the third computing unit is specifically configured to:
determine an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
attenuate the light intensity parameter of the light emitting source according to the attenuation coefficient to obtain a temporary light intensity parameter;
correct the temporary light intensity parameter by the reflection coefficient to obtain a target light intensity parameter;
calculate the tone weight of the target area according to the target light intensity parameter.
As can be seen from the above technical solutions, the embodiments of the present invention provide an image processing method and a corresponding image processing apparatus applied to a virtual reality device. The virtual reality device obtains the color value of the light emitting source in a virtual scene, determines an image of the virtual scene picture, and determines a target area from that image, the target area being the region in the virtual scene illuminated by the light emitting source. It then determines a target color value according to the color value of the light emitting source and the color value of the target area, and finally replaces the color value of the target area with the target color value. That is, in the embodiments of the present invention, the target color value is mapped directly onto the illuminated area to obtain the lighting effect of that area in the virtual scene picture, which avoids multi-bounce reflection calculations, reduces the amount of computation, and reduces the power consumption of the virtual reality device.
Description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for describing the embodiments are briefly introduced below. Apparently, the drawings in the following description show merely some embodiments of the present invention, and a person skilled in the art may derive other drawings from these drawings.
Fig. 1 is a schematic flowchart of an embodiment of an image processing method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of another embodiment of the image processing method according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an embodiment of an image processing apparatus according to an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention provide an image processing method and apparatus, applied to a virtual reality device, for reducing the power consumption of the virtual reality device.
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the embodiments of the present invention.
The terms "first", "second", "third", "fourth", and so on (if any) in the specification, the claims, and the accompanying drawings are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data termed in such a way are interchangeable in appropriate circumstances, so that the embodiments described herein can be implemented in orders other than those illustrated or described herein. Moreover, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that contains a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such a process, method, product, or device.
The embodiments of the present invention apply to virtual reality devices. For ease of understanding and description, a brief introduction to the applicable virtual reality devices is given first:
The virtual reality device in the embodiments of the present invention refers to a virtual reality head-mounted display (HMD). In general, HMDs fall into three categories: tethered HMDs, standalone HMDs, and phone-box HMDs, i.e., HMDs that use a mobile phone as the display (also known as VR glasses). A person skilled in the art will appreciate that a complete virtual reality system generally includes a virtual environment; a virtual environment processor with a high-performance computer as its core; a display system; an auditory system centered on speech recognition, sound synthesis, and sound localization; body posture tracking equipment based on orientation trackers, data gloves, and data suits; and functional units or modules for taste, smell, touch, and force feedback. It should be noted that this description of the virtual reality system in the embodiments of the present invention does not constitute a limitation on it.
Referring to Fig. 1, Fig. 1 is a schematic flowchart of an embodiment of an image processing method according to an embodiment of the present invention. The method includes:
101. The virtual reality device obtains the color value of a light emitting source.
The virtual reality device simulates real-world scene effects through a virtual scene. The light emitting source is the corresponding luminous object in the virtual scene, and the color value of the light emitting source refers to the color value of that luminous object.
102. The virtual reality device determines an image of the virtual scene picture.
The virtual reality device determines the image of the virtual scene picture to be displayed, i.e., the image under the virtual scene. It should be noted that no execution order is imposed between step 101 and step 102; this is not specifically limited here.
103. The virtual reality device determines a target area from the image of the virtual scene picture.
After determining the image of the virtual scene picture, the virtual reality device determines from that image the target area illuminated by the light emitting source.
It should be noted that no execution order is imposed between step 103 and step 101 either, as long as step 103 is executed after step 102.
104. The virtual reality device determines a target color value according to the color value of the light emitting source and the color value of the target area.
After determining the area of the virtual scene picture illuminated by the light emitting source, the virtual reality device determines a target color value from the color value of the light emitting source and the color value of the target area. Once the color value of the target area is replaced with this target color value, the displayed color of the target area is closer to the color of an object illuminated by a light source in a real scene.
105. The virtual reality device replaces the color value of the target area with the target color value.
As can be seen from the above technical solution, this embodiment of the present invention provides an image processing method: the virtual reality device obtains the color value of the light emitting source in a virtual scene, determines an image of the virtual scene picture, determines from that image the target area illuminated by the light emitting source, determines a target color value according to the color value of the light emitting source and the color value of the target area, and finally replaces the color value of the target area with the target color value. That is, to simulate the lighting effect, the color value of the illuminated target area is replaced by the target color value determined directly from the color value of the light emitting source and the color value of the target area; the target color value is mapped directly onto the illuminated area to obtain its lighting effect in the virtual scene picture. This avoids multi-bounce reflection calculations, reduces the amount of computation, and reduces the power consumption of the virtual reality device.
For ease of understanding, an image processing method according to an embodiment of the present invention is described in detail below. Referring to Fig. 2, Fig. 2 is a schematic flowchart of another embodiment of the image processing method according to an embodiment of the present invention, including:
201. The virtual reality device obtains the color value of each pixel in the region corresponding to the light emitting source.
The color value of each pixel in the region corresponding to the light emitting source refers to the color value of each pixel in the region where the light emitting source is located.
Taking a virtual cinema scene as an example: in a real cinema, while a film is playing, usually only the movie screen is a luminous object. In the corresponding virtual scene, the virtual reality device can obtain the color value of each pixel of the movie screen under the virtual scene.
202. The virtual reality device calculates the average of the color values of the pixels and uses the average as the color value of the light emitting source.
After obtaining the color value of each pixel in the region where the light emitting source is located, the virtual reality device calculates the average of these color values and uses the average as the color value of the light emitting source.
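As an illustration only (the patent prescribes no particular API), a minimal Python sketch of this averaging step might look as follows; the function name and the (r, g, b) pixel representation are assumptions:
def average_source_color(pixels):
    # pixels: iterable of (r, g, b) values sampled from the light
    # emitting source region; the per-channel average is used as
    # the color value of the light emitting source
    pixels = list(pixels)
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))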
It should be noted here that, in some embodiments of the present invention, obtaining the color value of the light emitting source may instead proceed as follows:
The virtual reality device divides the region corresponding to the light emitting source into a light spot area and a halo area, obtains the color value of each pixel of the light spot area, and obtains the color value of each pixel of the halo area.
The virtual reality device calculates a first average from the color values of the pixels of the light spot area and a second average from the color values of the pixels of the halo area.
The virtual reality device weights the first average with a first weight to obtain a first color value and weights the second average with a second weight to obtain a second color value, the first weight being greater than the second weight. The first weight and the second weight can be configured according to the actual application; this is not specifically limited here.
The virtual reality device then calculates the color value of the light emitting source according to the first color value and the second color value, where a simple mathematical operation on the two suffices: for example, the first color value and the second color value may simply be added, or their average may be used as the color value of the light emitting source. This is not specifically limited here.
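A corresponding sketch of this light spot/halo variant, reusing average_source_color from above; the concrete weights 0.7 and 0.3 and the additive combination are illustrative assumptions consistent with the text (the first weight greater than the second):
def source_color_spot_halo(spot_pixels, halo_pixels, w_spot=0.7, w_halo=0.3):
    # first average over the light spot area, second over the halo area
    first_avg = average_source_color(spot_pixels)
    second_avg = average_source_color(halo_pixels)
    # weight each average; the first weight must exceed the second
    first_color = [w_spot * c for c in first_avg]
    second_color = [w_halo * c for c in second_avg]
    # one simple combination named by the text: add the weighted colors
    return [a + b for a, b in zip(first_color, second_color)]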
It should further be noted that, besides the above approaches, in some embodiments of the present invention obtaining the color value of the light emitting source may also proceed as follows:
The virtual reality device obtains the color values of a preset number of pixels in the region corresponding to the light emitting source, where the preset number is a preconfigured value that is not specifically limited here. For example, the color values of 4 pixels in the region corresponding to the light emitting source may be taken, and their average used as the color value of the light emitting source.
203. The virtual reality device determines the reflection coefficient of the target area.
It should be understood that different materials have different reflectances; the reflectance indicates a material's ability to reflect light. In the embodiments of the present invention, the reflection coefficient of the target area is positively correlated with the reflectance of the material of the illuminated object corresponding to the target area.
For example, this positive correlation may specifically mean that the reflection coefficient of the target area is proportional to the reflectance of the material of the corresponding illuminated object, or it may be some other positive correlation; this is not specifically limited here.
204. The virtual reality device determines an attenuation coefficient according to the scene distance and the scene transparency parameter.
Here, the scene distance is the distance between the light emitting source and the target area in the scene; it may refer to the straight-line distance from the center of the light emitting source to the center of the target area, which is not specifically limited here. The scene transparency parameter describes how transparent the scene is between the light emitting source and the target area. It should be understood that in a real scene, light attenuates as it travels, influenced by factors such as the scene distance and the transparency of the scene. Correspondingly, in the embodiments of the present invention, the virtual reality device determines an attenuation coefficient from the scene distance between the light emitting source and the target area of the virtual scene and the scene transparency parameter, where the attenuation coefficient is positively correlated with the scene distance and negatively correlated with the scene transparency parameter.
It should be noted that, in practical applications, any relation satisfying these correlations may be used; this is not specifically limited here.
It should also be noted that no execution order is imposed between step 203 and step 204, and steps 203 and 204 have no required order relative to steps 201 and 202 either; this is not specifically limited here.
205. The virtual reality device attenuates the light intensity parameter of the light emitting source according to the attenuation coefficient to obtain a temporary light intensity parameter.
It should be understood that during real light propagation, the light emitted by a source attenuates under the influence of the scene distance and the transparency of the scene. In the embodiments of the present invention, after determining the attenuation coefficient in step 204, the virtual reality device attenuates the light intensity parameter of the light emitting source, i.e., its luminous intensity parameter, according to the attenuation coefficient to obtain a temporary light intensity parameter, thereby simulating the light intensity emitted by the light emitting source in the virtual scene.
206. The virtual reality device corrects the temporary light intensity parameter by the reflection coefficient to obtain a target light intensity parameter.
After the temporary light intensity parameter is determined, since different materials of illuminated objects have different reflection coefficients, in the embodiments of the present invention the virtual reality device corrects the temporary light intensity parameter according to the reflection coefficient of the material of the illuminated object to obtain a target light intensity parameter, which simulates the light intensity around the illuminated object in the real target region.
207. The virtual reality device calculates the tone weight of the target area according to the target light intensity parameter.
After the target light intensity parameter is obtained, the virtual reality device calculates the tone weight of the target area from it. It should be noted that, in practical applications, the target light intensity parameter may be used directly as the tone weight, or a simple mathematical operation on it may yield the tone weight of the target area; this is not specifically limited here.
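Steps 204 to 207 can be illustrated together. The sketch below is hypothetical: the text fixes only the correlations (attenuation grows with the scene distance and shrinks with the scene transparency) and the step order, so the ratio form, the 1/(1+k) falloff, and the direct use of the target intensity as the tone weight are assumed choices:
def tone_weight(light_intensity, scene_distance, scene_transparency,
                reflection_coeff):
    # step 204: attenuation coefficient, positively correlated with the
    # scene distance and negatively correlated with the transparency
    attenuation = scene_distance / max(scene_transparency, 1e-6)
    # step 205: attenuate the source intensity -> temporary intensity
    temp_intensity = light_intensity / (1.0 + attenuation)
    # step 206: correct by the reflection coefficient -> target intensity
    target_intensity = temp_intensity * reflection_coeff
    # step 207: use the target intensity directly as the tone weight
    return target_intensity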
208. The virtual reality device determines the target color value according to the color value of the light emitting source, the color value of the target area, and the tone weight of the target area.
In the embodiments of the present invention, this may include: after the tone weight of the target area is determined, the virtual reality device enhances the color value of the light emitting source using the tone weight to obtain a first weighted value, attenuates the color value of the target area using the tone weight to obtain a second weighted value, and uses the absolute value of the difference between the first weighted value and the second weighted value as the target color value. A simple example:
Suppose the color value of the light emitting source is green and the color value of the target area is red. The tone weight can be used to enhance the color value of the light emitting source, i.e., to increase the chroma or brightness of its green using the calculated tone weight, yielding the enhanced color value of the light emitting source, the first weighted value. The tone weight is then used to attenuate the color value of the target area, i.e., to reduce the brightness or chroma of its red, yielding the attenuated color value of the target area, the second weighted value. Finally, the difference between the first weighted value and the second weighted value is taken as the target color value. Taking the RGB color space as an example, suppose the RGB value of the first weighted value is (0, 255, 0) and the RGB value of the second weighted value is (255, 0, 0); then the absolute value of their difference is (255, 255, 0), which is used as the target color value. It should be noted that the RGB color space is used here for illustration; other color spaces, such as YUV (also known as YCrCb), work analogously. This is not specifically limited and is not described again here.
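A sketch of this enhancement/attenuation scheme; the multiplicative forms (1 + weight) and (1 - weight) and the clamping are assumptions, while the per-channel absolute difference follows the text:
def target_color(source_color, area_color, weight):
    # enhance the source color by the tone weight (first weighted value)
    enhanced = [min(255.0, c * (1.0 + weight)) for c in source_color]
    # attenuate the area color by the tone weight (second weighted value)
    attenuated = [c * max(0.0, 1.0 - weight) for c in area_color]
    # target color value: per-channel absolute difference of the two
    return [abs(e - a) for e, a in zip(enhanced, attenuated)]
# With the already-weighted example values from the text:
# |(0, 255, 0) - (255, 0, 0)| = (255, 255, 0)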
It should be noted here that, besides the manner described above, the virtual reality device may also determine the target color value from the color value of the light emitting source, the color value of the target area, and the tone weight of the target area in other ways; this is not specifically limited here.
In addition, it should be noted that, in practical applications, the target color value may also be determined from the color value of the target area and the color value of the light emitting source alone, for example by a simple fusion of the two, such as:
LightColor = (color1 + color2 + color3 + color4) / 4;
Color = (LightColor * reflection coefficient + ObjectColor) * 0.3;
where LightColor is the color value of the light emitting source, obtained by averaging 4 pixels color1, color2, color3, and color4 of the region where the light emitting source is located; Color is the target color value; ObjectColor is the color value of the target area; and the reflection coefficient refers to the reflection coefficient of the target area illuminated by the light emitting source.
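Transcribed per channel into Python (colors as (r, g, b) sequences; the function name is hypothetical, the constants mirror the formula above):
def fused_target_color(c1, c2, c3, c4, object_color, reflection_coeff):
    # LightColor = (color1 + color2 + color3 + color4) / 4
    light_color = [(a + b + c + d) / 4.0 for a, b, c, d in zip(c1, c2, c3, c4)]
    # Color = (LightColor * reflection coefficient + ObjectColor) * 0.3
    return [(lc * reflection_coeff + oc) * 0.3
            for lc, oc in zip(light_color, object_color)]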
209. The virtual reality device replaces the color value of the target area with the target color value.
After determining the target color value, the virtual reality device replaces the color value of the target area with the target color value.
As can be seen from the above technical solution, an embodiment of the present invention provides an image processing method that can simulate a lighting effect: the color value of the target area illuminated by the light emitting source is replaced directly by the target color value determined from the color value of the light emitting source and the color value of the target area, i.e., the target color value is mapped directly onto the illuminated area to obtain its lighting effect in the virtual scene picture. This avoids multi-bounce reflection calculations, reduces the amount of computation, and reduces the power consumption of the virtual reality device.
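Putting the steps together, a hypothetical end-to-end pass over one frame could use the sketch functions above (frame as a mapping from pixel coordinates to (r, g, b); applying the target color per target pixel is one reading of the text):
def relight_frame(frame, source_pixels, target_pixels,
                  light_intensity, scene_distance, scene_transparency,
                  reflection_coeff):
    # steps 201-202: source color value from its region's pixels
    src = average_source_color(frame[p] for p in source_pixels)
    # steps 203-207: tone weight from reflection, distance, transparency
    w = tone_weight(light_intensity, scene_distance, scene_transparency,
                    reflection_coeff)
    # steps 208-209: replace each target pixel with the target color value
    for p in target_pixels:
        frame[p] = target_color(src, frame[p], w)
    return frame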
An image processing method according to an embodiment of the present invention has been described above. Based on that method, an image processing apparatus according to an embodiment of the present invention is described below:
Referring to Fig. 3, Fig. 3 is a schematic structural diagram of an embodiment of an image processing apparatus according to an embodiment of the present invention. The image processing apparatus includes an acquisition module 101, a first determining module 102, a second determining module 103, a third determining module 104, and a replacement module 105.
The acquisition module 101 is configured to obtain the color value of a light emitting source, the light emitting source being a light emitting source in a virtual scene.
The first determining module 102 is configured to determine an image of the virtual scene picture.
The second determining module 103 is configured to determine a target area from the image of the virtual scene picture determined by the first determining module 102, the target area being the region in the virtual scene that is illuminated by the light emitting source.
The third determining module 104 is configured to determine a target color value according to the color value of the light emitting source obtained by the acquisition module 101 and the color value of the target area determined by the second determining module 103.
The replacement module 105 is configured to replace the color value of the target area determined by the second determining module 103 with the target color value determined by the third determining module 104.
In a possible implementation, the acquisition module 101 includes:
a first acquisition unit 1011, configured to obtain the color value of each pixel in the region corresponding to the light emitting source;
a first computing unit 1012, configured to calculate the average of the color values obtained by the first acquisition unit 1011 and use the average as the color value of the light emitting source.
In a possible implementation, the acquisition module 101 includes:
a division unit 1013, configured to divide the region corresponding to the light emitting source into a light spot area and a halo area;
a second acquisition unit 1014, configured to obtain the color value of each pixel of the light spot area divided by the division unit 1013 and the color value of each pixel of the halo area divided by the division unit 1013;
a second computing unit 1015, configured to calculate a first average from the color values of the pixels of the light spot area obtained by the second acquisition unit 1014 and a second average from the color values of the pixels of the halo area obtained by the second acquisition unit 1014.
The second computing unit 1015 is further configured to weight the first average with a first weight to obtain a first color value and to weight the second average with a second weight to obtain a second color value, the first weight being greater than the second weight.
The second computing unit 1015 is further configured to calculate the color value of the light emitting source according to the first color value and the second color value.
In a possible implementation, the third determining module 104 includes:
a first determination unit 1041, configured to determine the reflection coefficient of the target area, the reflection coefficient being positively correlated with the reflectance of the material of the illuminated object corresponding to the target area;
a third computing unit 1042, configured to calculate the tone weight of the target area according to the reflection coefficient, the scene distance between the light emitting source and the target area, and the scene transparency parameter;
a second determination unit 1043, configured to determine the target color value according to the color value of the light emitting source, the color value of the target area, and the tone weight of the target area.
In a possible implementation, the third computing unit 1042 is specifically configured to:
determine an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
attenuate the light intensity parameter of the light emitting source according to the attenuation coefficient to obtain a temporary light intensity parameter;
correct the temporary light intensity parameter by the reflection coefficient to obtain a target light intensity parameter;
calculate the tone weight of the target area according to the target light intensity parameter.
It can be seen that an embodiment of the present invention provides an image processing apparatus that obtains the color value of the light emitting source in a virtual scene, determines an image of the virtual scene picture, determines from that image the target area, i.e., the region in the virtual scene illuminated by the light emitting source, determines a target color value according to the color value of the light emitting source and the color value of the target area, and finally replaces the color value of the target area with the target color value. That is, the lighting effect of the virtual scene picture is simulated by replacing the color value of the target area with the target color value determined directly from the color value of the light emitting source and the color value of the target area, which avoids multi-bounce reflection calculations, reduces the amount of computation, and reduces the power consumption of the virtual reality device.
It is clear to a person skilled in the art that, for convenience and brevity of description, for the specific working processes and further details of the apparatus, modules, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not described again here.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative. The division into units is merely a division by logical function; in actual implementation there may be other divisions, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Moreover, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are merely intended to describe the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions recorded in the foregoing embodiments, or equivalent replacements may be made to some of their technical features, without such modifications or replacements departing from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. An image processing method, characterized in that the method comprises:
obtaining, by a virtual reality device, the color value of a light emitting source, the light emitting source being a light emitting source in a virtual scene;
determining, by the virtual reality device, an image of the virtual scene picture;
determining, by the virtual reality device, a target area from the image of the virtual scene picture, the target area being the region in the virtual scene that is illuminated by the light emitting source;
determining, by the virtual reality device, a target color value according to the color value of the light emitting source and the color value of the target area; and
replacing, by the virtual reality device, the color value of the target area with the target color value.
2. The method according to claim 1, characterized in that obtaining the color value of the light emitting source comprises:
obtaining, by the virtual reality device, the color value of each pixel in the region corresponding to the light emitting source; and
calculating, by the virtual reality device, the average of the color values of the pixels and using the average as the color value of the light emitting source.
3. The method according to claim 1, characterized in that obtaining the color value of the light emitting source comprises:
dividing, by the virtual reality device, the region corresponding to the light emitting source into a light spot area and a halo area;
obtaining, by the virtual reality device, the color value of each pixel of the light spot area and the color value of each pixel of the halo area;
calculating, by the virtual reality device, a first average from the color values of the pixels of the light spot area and a second average from the color values of the pixels of the halo area;
weighting, by the virtual reality device, the first average with a first weight to obtain a first color value and the second average with a second weight to obtain a second color value, the first weight being greater than the second weight; and
calculating, by the virtual reality device, the color value of the light emitting source according to the first color value and the second color value.
4. The method according to any one of claims 1 to 3, characterized in that determining the target color value according to the color value of the light emitting source and the color value of the target area comprises:
determining, by the virtual reality device, a reflection coefficient of the target area, the reflection coefficient being positively correlated with the reflectance of the material of the illuminated object corresponding to the target area;
calculating, by the virtual reality device, a tone weight of the target area according to the reflection coefficient, the scene distance between the light emitting source and the target area, and a scene transparency parameter; and
determining, by the virtual reality device, the target color value according to the color value of the light emitting source, the color value of the target area, and the tone weight of the target area.
5. The method according to claim 4, characterized in that calculating the tone weight of the target area according to the reflection coefficient, the scene distance between the light emitting source and the target area, and the scene transparency parameter comprises:
determining, by the virtual reality device, an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
attenuating, by the virtual reality device, the light intensity parameter of the light emitting source according to the attenuation coefficient to obtain a temporary light intensity parameter;
correcting, by the virtual reality device, the temporary light intensity parameter by the reflection coefficient to obtain a target light intensity parameter; and
calculating, by the virtual reality device, the tone weight of the target area according to the target light intensity parameter.
6. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to obtain the color value of a light emitting source, the light emitting source being a light emitting source in a virtual scene;
a first determining module, configured to determine an image of the virtual scene picture;
a second determining module, configured to determine a target area from the image of the virtual scene picture determined by the first determining module, the target area being the region in the virtual scene that is illuminated by the light emitting source;
a third determining module, configured to determine a target color value according to the color value of the light emitting source obtained by the acquisition module and the color value of the target area determined by the second determining module; and
a replacement module, configured to replace the color value of the target area determined by the second determining module with the target color value determined by the third determining module.
7. The apparatus according to claim 6, characterized in that the acquisition module comprises:
a first acquisition unit, configured to obtain the color value of each pixel in the region corresponding to the light emitting source; and
a first computing unit, configured to calculate the average of the color values obtained by the first acquisition unit and use the average as the color value of the light emitting source.
8. The apparatus according to claim 6, characterized in that the acquisition module comprises:
a division unit, configured to divide the region corresponding to the light emitting source into a light spot area and a halo area;
a second acquisition unit, configured to obtain the color value of each pixel of the light spot area divided by the division unit and the color value of each pixel of the halo area divided by the division unit; and
a second computing unit, configured to calculate a first average from the color values of the pixels of the light spot area obtained by the second acquisition unit and a second average from the color values of the pixels of the halo area obtained by the second acquisition unit;
wherein the second computing unit is further configured to weight the first average with a first weight to obtain a first color value and to weight the second average with a second weight to obtain a second color value, the first weight being greater than the second weight; and
the second computing unit is further configured to calculate the color value of the light emitting source according to the first color value and the second color value.
9. The apparatus according to any one of claims 6 to 8, characterized in that the third determining module comprises:
a first determination unit, configured to determine the reflection coefficient of the target area, the reflection coefficient being positively correlated with the reflectance of the material of the illuminated object corresponding to the target area;
a third computing unit, configured to calculate the tone weight of the target area according to the reflection coefficient, the scene distance between the light emitting source and the target area, and the scene transparency parameter; and
a second determination unit, configured to determine the target color value according to the color value of the light emitting source, the color value of the target area, and the tone weight of the target area.
10. The apparatus according to claim 9, characterized in that the third computing unit is specifically configured to:
determine an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
attenuate the light intensity parameter of the light emitting source according to the attenuation coefficient to obtain a temporary light intensity parameter;
correct the temporary light intensity parameter by the reflection coefficient to obtain a target light intensity parameter; and
calculate the tone weight of the target area according to the target light intensity parameter.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611082577.7A CN106534835B (en) | 2016-11-30 | 2016-11-30 | A kind of image processing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106534835A (en) | 2017-03-22
CN106534835B (en) | 2018-08-07
Family
ID=58355378
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611082577.7A Active CN106534835B (en) | 2016-11-30 | 2016-11-30 | A kind of image processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106534835B (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107203266A (en) * | 2017-05-17 | 2017-09-26 | 东莞市华睿电子科技有限公司 | A kind of data processing method based on VR |
CN107193377A (en) * | 2017-05-17 | 2017-09-22 | 东莞市华睿电子科技有限公司 | A kind of method for displaying image based on VR |
CN107220959A (en) * | 2017-05-17 | 2017-09-29 | 东莞市华睿电子科技有限公司 | A kind of image processing method based on unmanned plane |
CN108701372B8 (en) * | 2017-05-19 | 2021-07-27 | 华为技术有限公司 | Image processing method and device |
US10922878B2 (en) * | 2017-10-04 | 2021-02-16 | Google Llc | Lighting for inserted content |
CN107635123B (en) | 2017-10-30 | 2019-07-19 | Oppo广东移动通信有限公司 | White balancing treatment method and device, electronic device and computer readable storage medium |
CN108063926B (en) * | 2017-12-25 | 2020-01-10 | Oppo广东移动通信有限公司 | Image processing method and device, computer readable storage medium and computer device |
CN107959851B (en) * | 2017-12-25 | 2019-07-19 | Oppo广东移动通信有限公司 | Colour temperature detection method and device, computer readable storage medium and computer equipment |
CN108012135B (en) * | 2017-12-25 | 2019-09-06 | Oppo广东移动通信有限公司 | Image processing method and device, computer readable storage medium and computer equipment |
CN108063934B (en) * | 2017-12-25 | 2020-01-10 | Oppo广东移动通信有限公司 | Image processing method and device, computer readable storage medium and computer device |
CN108063933B (en) * | 2017-12-25 | 2020-01-10 | Oppo广东移动通信有限公司 | Image processing method and device, computer readable storage medium and computer device |
CN107959842B (en) * | 2017-12-25 | 2019-06-07 | Oppo广东移动通信有限公司 | Image processing method and device, computer readable storage medium and computer equipment |
CN108174172B (en) * | 2017-12-25 | 2019-09-06 | Oppo广东移动通信有限公司 | Image pickup method and device, computer readable storage medium and computer equipment |
CN108174173B (en) * | 2017-12-25 | 2020-01-10 | Oppo广东移动通信有限公司 | Photographing method and apparatus, computer-readable storage medium, and computer device |
CN108156434B (en) * | 2017-12-25 | 2019-07-05 | Oppo广东移动通信有限公司 | Image processing method and device, computer readable storage medium and computer equipment |
CN107959843B (en) * | 2017-12-25 | 2019-07-05 | Oppo广东移动通信有限公司 | Image processing method and device, computer readable storage medium and computer equipment |
CN108335362B (en) * | 2018-01-16 | 2021-11-12 | 重庆爱奇艺智能科技有限公司 | Light control method and device in virtual scene and VR (virtual reality) equipment |
CN111080801A (en) * | 2019-12-26 | 2020-04-28 | 北京像素软件科技股份有限公司 | Virtual model display method and device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101458823A (en) * | 2008-12-19 | 2009-06-17 | 北京航空航天大学 | Real-time lighting drawing method under virtual stage environment |
CN103155004A (en) * | 2010-09-01 | 2013-06-12 | 马斯科公司 | Apparatus, system, and method for demonstrating a lighting solution by image rendering |
CN103606182A (en) * | 2013-11-19 | 2014-02-26 | 华为技术有限公司 | Method and device for image rendering |
CN104639843A (en) * | 2014-12-31 | 2015-05-20 | 小米科技有限责任公司 | Method and device for processing image |
WO2015127146A1 (en) * | 2014-02-19 | 2015-08-27 | Evergaze, Inc. | Apparatus and method for improving, augmenting or enhancing vision |
CN204964878U (en) * | 2015-08-12 | 2016-01-13 | 毛颖 | Can eliminate head -mounted display apparatus of colour difference |
CN105741343A (en) * | 2016-01-28 | 2016-07-06 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN105991987A (en) * | 2016-06-07 | 2016-10-05 | 深圳市金立通信设备有限公司 | Image processing method, equipment and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |