CN108305328A - Virtual object rendering method, system, medium and computing device - Google Patents

Virtual object rendering method, system, medium and computing device

Info

Publication number
CN108305328A
Authority
CN
China
Prior art keywords
virtual object
information
mobile terminal
illumination
real environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810131051.6A
Other languages
Chinese (zh)
Inventor
朱斯衎
陈艳蕾
赵辰
丛林
邵文坚
郭于晨
上官福豪
刘
唐秦崴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yixian Advanced Technology Co ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201810131051.6A priority Critical patent/CN108305328A/en
Publication of CN108305328A publication Critical patent/CN108305328A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G06T15/506 - Illumination models

Abstract

Embodiments of the present invention provide a virtual object rendering method for a mobile terminal, including: obtaining ambient light information of a real environment through a camera of the mobile terminal; based on the ambient light information, determining the illumination information received by a virtual object at the position at which it is inserted into the real environment; and rendering the virtual object based on the received illumination information. Because the virtual object is rendered according to the illumination it actually receives at its insertion position in the real environment, the method enables the virtual object to respond in real time to changes in real-environment lighting and remain consistent with that lighting, which markedly improves the realism of the virtual object and makes its combination with the real environment closer and more natural. Embodiments of the present invention further provide a virtual object rendering system for a mobile terminal, a medium and a computing device.

Description

Virtual object rendering method, system, medium and computing device
Technical field
Embodiments of the present invention relate to the field of smart devices, and more specifically, to a virtual object rendering method, system, medium and computing device for a mobile terminal.
Background technology
This section is intended to provide background or context for the embodiments of the present invention that are set forth in the claims. The description here is not admitted to be prior art merely because it is included in this section.
Augmented reality (AR) is a technology that combines computer-generated virtual items, such as virtual objects and scenes, with the real world by accurately estimating the position and angle of a camera image and applying image analysis. The camera generally provides basic functions such as video capture/streaming and still image capture. After the lens collects the image, the camera's photosensitive components and control circuitry process and convert the image into a digital signal that a computer can recognize; the signal is then transferred to the computer via a parallel port or USB connection, and the image is restored by software.
At present, a number of techniques have appeared that fuse virtual objects into real scenes to create augmented reality scenes. However, in the course of developing the present disclosure, the inventors found that the related art has at least the following problems:
In the augmented reality techniques provided by the related art, the lighting of a virtual object is produced by a virtual light source: the color of the virtual object's surface is determined entirely by the object's diffuse reflection of the virtual light source and by the color of the virtual light source itself. Such techniques cannot reproduce the lighting of the real environment, so the virtual object does not look realistic.
Summary of the invention
As noted above, in the related art the lighting of a virtual object is produced by a virtual light source; for example, the color of the virtual object's surface is determined by the object's diffuse reflection of the virtual light source and by the color of the virtual light source itself.
In the prior art, therefore, a virtual object cannot exhibit the lighting of the real environment, so the virtual object does not look realistic, which is a very troublesome problem.
Thus, an improved virtual object rendering method is highly desirable, so that the virtual object exhibits the lighting effects of real-environment illumination and its realism is enhanced.
In this context, embodiments of the present invention are intended to provide a virtual object rendering method for a mobile terminal and a corresponding system.
In a first aspect of embodiments of the present invention, a virtual object rendering method for a mobile terminal is provided, including: obtaining ambient light information of a real environment through a camera of the mobile terminal; based on the ambient light information, determining the illumination information received by a virtual object at the position at which it is inserted into the real environment; and rendering the virtual object based on the received illumination information.
In one embodiment of the invention, the method further includes: obtaining pose information of the mobile terminal through an inertial measurement unit of the mobile terminal.
In one embodiment of the invention, obtaining the ambient light information of the real environment through the camera of the mobile terminal includes: shooting the real environment with the camera of the mobile terminal to obtain a plurality of dynamic scene images; and matching the plurality of dynamic scene images to obtain an environment sphere of the real environment.
In another embodiment of the invention, the method further includes: updating the ambient light information according to the pose information of the mobile terminal.
In one embodiment of the invention, rendering the virtual object based on the received illumination information includes: determining, according to the received illumination information, lighting information of the virtual object in each direction; and rendering the virtual object based on the lighting information.
In one embodiment of the invention, determining the lighting information of the virtual object in each direction according to the received illumination information includes: determining the lighting information of the virtual object in each direction using a PBR algorithm.
In one embodiment of the invention, the lighting information includes light direction, light intensity and/or light color.
In one embodiment of the invention, for a virtual object with a translucent appearance, the light intensity of the virtual object in each direction includes intensities of scattered and transmitted light calculated according to the subsurface scattering principle.
In one embodiment of the invention, the virtual object with a translucent appearance includes: jade, a candle, milk or skin.
In one embodiment of the invention, obtaining the pose information of the mobile terminal through the inertial measurement unit of the mobile terminal includes: determining a quaternion from the 9-axis data of the inertial measurement unit of the mobile terminal to obtain the pose information of the mobile terminal.
In a second aspect of embodiments of the present invention, a virtual object rendering system for a mobile terminal is provided, including: a first acquisition module for obtaining ambient light information of a real environment through a camera of the mobile terminal; a determining module for determining, based on the ambient light information, the illumination information received by a virtual object at the position at which it is inserted into the real environment; and a rendering module for rendering the virtual object based on the received illumination information.
In one embodiment of the invention, the system further includes: a second acquisition module for obtaining pose information of the mobile terminal through an inertial measurement unit of the mobile terminal.
In one embodiment of the invention, the first acquisition module includes: a shooting unit for shooting the real environment with the camera of the mobile terminal to obtain a plurality of dynamic scene images; and a matching unit for matching the plurality of dynamic scene images to obtain an environment sphere of the real environment.
In one embodiment of the invention, the system further includes: an update module for updating the ambient light information according to the pose information of the mobile terminal.
In one embodiment of the invention, the rendering module includes: a determination unit for determining, according to the received illumination information, lighting information of the virtual object in each direction; and a rendering unit for rendering the virtual object based on the lighting information.
In one embodiment of the invention, the determination unit is further configured to: determine, according to the received illumination information, the lighting information of the virtual object in each direction using a PBR algorithm.
In one embodiment of the invention, the lighting information includes light direction, light intensity and/or light color.
In one embodiment of the invention, for a virtual object with a translucent appearance, the light intensity of the virtual object in each direction includes intensities of scattered and transmitted light calculated according to the subsurface scattering principle.
In one embodiment of the invention, the virtual object with a translucent appearance includes: jade, a candle, milk or skin.
In one embodiment of the invention, the second acquisition module is further configured to: determine a quaternion from the 9-axis data of the inertial measurement unit of the mobile terminal to obtain the pose information of the mobile terminal.
In a third aspect of embodiments of the present invention, a medium is provided, storing computer-executable instructions which, when executed by a processing unit, implement the virtual object rendering method described in any one of the embodiments above.
In a fourth aspect of embodiments of the present invention, a computing device is provided, including: a processing unit; and a storage unit storing computer-executable instructions which, when executed by the processing unit, implement the virtual object rendering method described in any one of the embodiments above.
According to the virtual object rendering method and system for a mobile terminal of embodiments of the present invention, the virtual object can be rendered based on the illumination information it receives at the position at which it is inserted into the real environment. The method of the invention thus enables the virtual object to respond in real time to changes in real-environment lighting and remain consistent with that lighting, which markedly improves the realism of the virtual object and makes its combination with the real environment closer and more natural.
Description of the drawings
The above and other objects, features and advantages of exemplary embodiments of the present invention will become easier to understand by reading the following detailed description with reference to the accompanying drawings. In the drawings, several embodiments of the present invention are shown by way of example and not limitation, in which:
Figures 1A and 1B schematically show application scenarios of the virtual object rendering method and system for a mobile terminal according to embodiments of the present invention;
Figure 2 schematically shows a flowchart of the virtual object rendering method for a mobile terminal according to an embodiment of the present invention;
Figure 3A schematically shows a flowchart of a virtual object rendering method for a mobile terminal according to another embodiment of the present invention;
Figure 3B schematically shows a flowchart of obtaining ambient light information of a real environment through the camera of a mobile terminal according to an embodiment of the present invention;
Figure 3C schematically shows a flowchart of a virtual object rendering method for a mobile terminal according to a further embodiment of the present invention;
Figure 3D schematically shows a flowchart of rendering a virtual object based on received illumination information according to an embodiment of the present invention;
Figure 4 schematically shows a block diagram of the virtual object rendering system for a mobile terminal according to an embodiment of the present invention;
Figure 5A schematically shows a block diagram of a virtual object rendering system for a mobile terminal according to another embodiment of the present invention;
Figure 5B schematically shows a block diagram of the first acquisition module according to an embodiment of the present invention;
Figure 5C schematically shows a block diagram of a virtual object rendering system for a mobile terminal according to a further embodiment of the present invention;
Figure 5D schematically shows a block diagram of the rendering module according to an embodiment of the present invention;
Figure 6 schematically shows a schematic diagram of a computer-readable storage medium product according to an embodiment of the present invention; and
Figure 7 schematically shows a block diagram of a computing device according to an embodiment of the present invention.
In the drawings, identical or corresponding reference numerals indicate identical or corresponding parts.
Detailed description of embodiments
The principles and spirit of the present invention are described below with reference to several illustrative embodiments. It should be understood that these embodiments are provided only so that those skilled in the art can better understand and implement the present invention, and are not intended to limit the scope of the invention in any way. Rather, these embodiments are provided so that the present disclosure is thorough and complete and fully conveys the scope of the disclosure to those skilled in the art.
Those skilled in the art will appreciate that embodiments of the present invention may be implemented as a system, an apparatus, a device, a method or a computer program product. Accordingly, the present disclosure may be embodied entirely in hardware, entirely in software (including firmware, resident software, microcode, etc.), or in a combination of hardware and software.
According to embodiments of the present invention, a virtual object rendering method, system, medium and computing device for a mobile terminal are proposed.
Herein, it is to be understood that the terms and abbreviations involved mainly include the following.
Augmented reality (AR): a technology that accurately estimates the position and angle of a camera image and applies image analysis so that the virtual world on the screen can be combined with, and interact with, the real-world scene.
Color camera: a camera that generally provides basic functions such as video capture/streaming and still image capture; after the lens collects the image, the camera's photosensitive components and control circuitry process and convert it into a digital signal that a computer can recognize, which is then transferred to the computer via a parallel port or USB connection and restored into an image by software.
Virtual object: an object in a virtual three-dimensional space generated by computer simulation; the object does not exist in the real scene.
Real environment: the objectively existing scene.
Texture map: in computer graphics, a bitmap stored in memory that is wrapped onto the surface of a 3D rendered object; texture mapping provides rich detail for an object and simulates a complex appearance in a simple way.
Rendering: the process of generating an image from a virtual object with a computer program.
Illumination: the light intensity in each direction in the real scene.
Mobile terminal: a portable device that can be carried, has a screen, and can run computer programs to perform rendering.
Computer graphics: abbreviated CG.
Material: the surface properties of a virtual object, including information such as roughness, metallicity and hardness.
Virtual light source: a light source that exists only in the virtual scene and is used to generate lighting on virtual objects.
Diffuse reflection: the phenomenon in which light striking a rough object surface is reflected irregularly in all directions.
Translucent: the property of allowing light to pass through. Transparent materials can be seen through, that is, they allow a clear image to pass; the opposite property is called opacity. Translucent materials, which lie between transparent and opaque materials, allow only part of the light to pass. Light is refracted when passing through translucent and transparent materials, so the image is distorted, and translucent materials also exhibit scattering.
Inertial measurement unit (IMU): a device that measures the three-axis attitude angles (or angular rates) and accelerations of an object.
Environment sphere map: a texture map whose mapping coordinates use a three-dimensional polar coordinate system, used to represent the light directed at an object from each direction of the environment.
Reflection: a physical phenomenon in which a wave, upon reaching the boundary with another medium, abruptly changes its direction of propagation and returns into the medium from which it came.
Refraction: a physical phenomenon in which an object or a wave, passing obliquely from one medium into another, changes speed and is deflected in angle.
Physically based rendering (PBR): a rendering method that, according to a physical illumination model, gives the virtual object a texture close to real physical materials during the rendering process.
In addition, any number of elements in the drawings is illustrative and not limiting, and any naming is used only for distinction and carries no limiting meaning.
The principles and spirit of the present invention are explained in detail below with reference to several representative embodiments of the present invention.
Overview of the invention
In the course of implementing embodiments of the present invention, the inventors found that the lighting of a virtual object is produced by a virtual light source: the color of the virtual object's surface is determined by the object's diffuse reflection of the virtual light source and by the color of the virtual light source itself, so the lighting of the real environment cannot be reproduced and the virtual object does not look realistic.
Embodiments of the present invention provide a virtual object rendering method for a mobile terminal, including: obtaining ambient light information of a real environment through a camera of the mobile terminal; based on the ambient light information, determining the illumination information received by a virtual object at the position at which it is inserted into the real environment; and rendering the virtual object based on the received illumination information. Because the virtual object is rendered according to the illumination it receives at its insertion position in the real environment, the method of the invention enables the virtual object to respond in real time to changes in real-environment lighting and remain consistent with that lighting, which markedly improves the realism of the virtual object and makes its combination with the real environment closer and more natural.
Having described the basic principle of the present invention, various non-limiting embodiments of the present invention are introduced in detail below.
Application scenarios overview
Application scenarios of the virtual object rendering method and system for a mobile terminal according to embodiments of the present invention are first described in detail with reference to Figures 1A and 1B.
Figure 1A schematically shows an application scenario of a virtual object rendering method in the related art. As shown in Figure 1A, in this augmented reality application scenario, a computer-generated virtual object or scene can be inserted into a real scene, realizing the combination of and interaction between the virtual scene and the real scene. Specifically, the real environment may include real objects 101 and 102, a real light source 103 that illuminates the real objects and produces lighting effects, a virtual object 104, and a virtual light source 105 that illuminates the virtual object to produce its lighting effects. The virtual light source exists only in the virtual scene; unlike with a real light source, the color of the virtual object's surface is determined entirely by the object's own diffuse reflection color and the color of the virtual light source.
Figure 1B schematically shows an application scenario of the virtual object rendering method according to an embodiment of the present invention. As shown in Figure 1B, in this augmented reality application scenario, a computer-generated virtual object or scene can be inserted into a real scene, realizing the combination of and interaction between the virtual scene and the real scene. Specifically, the real environment may include real objects 101 and 102, a virtual object 104, and a real light source 103 in the real environment. Light from the real light source 103 reaches the surface of the virtual object after multiple reflections and diffusion, and this mixture forms the illumination received by the virtual object. The received illumination is related to the specific position at which the virtual object is inserted into the real environment, so the virtual object can receive illumination consistent with the real environment and can respond in real time to changes in real-environment lighting.
It should be understood that the numbers of real objects, virtual objects, real light sources and virtual light sources in Figures 1A and 1B are only illustrative. According to implementation needs, there may be any number of real objects, virtual objects, real light sources and virtual light sources; the application scenarios here do not limit the embodiments of the present invention.
It should be noted that in these application scenarios the surface of the virtual object 104 may be smooth, and the object may be made of a transparent, translucent or opaque material; the present application does not limit attributes of the virtual object such as its shape and material.
Illustrative methods
With reference to the application scenario of Figure 1B, the virtual object rendering method for a mobile terminal according to exemplary embodiments of the present invention is described with reference to Figure 2 and Figures 3A to 3D. It should be noted that the above application scenarios are shown only to facilitate understanding of the spirit and principles of the present invention; embodiments of the present invention are not limited in this respect. Rather, embodiments of the present invention can be applied to any applicable scenario.
An embodiment of the present invention provides a virtual object rendering method for a mobile terminal.
Figure 2 schematically shows a flowchart of the virtual object rendering method for a mobile terminal according to an embodiment of the present invention.
As shown in Figure 2, the virtual object rendering method for a mobile terminal may include operations S210 to S230.
In operation S210, ambient light information of the real environment is obtained through the camera of the mobile terminal.
In operation S220, based on the ambient light information, the illumination information received by the virtual object at the position at which it is inserted into the real environment is determined.
In operation S230, the virtual object is rendered based on the received illumination information.
According to an exemplary embodiment of the present invention, the ambient light information of the real environment is obtained. Ambient light is the light formed at object surfaces after light from the real environment's light sources has been mixed by multiple reflections and diffusion. For example, scene images of the real environment can be captured with the color camera of the mobile terminal, and an environment sphere matched to those scene images can be obtained; the environment sphere represents the light arriving from every direction in the real environment.
It should be noted that lighting information includes, but is not limited to, light direction, light intensity and light color, where the light direction is a vector, and the light intensity and light color can be expressed in any existing color space, such as the RGB (Red, Green, Blue) primary color model or HSV (Hue, Saturation, Value); details are not repeated here.
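As a concrete, purely illustrative way of holding such lighting information in code, the following minimal Python sketch stores a direction vector, an intensity and an RGB color per sample; the class and field names are assumptions and are not part of the embodiments described above.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class LightingSample:
    """Illumination arriving at a surface point from one direction (illustrative only)."""
    direction: np.ndarray   # unit vector pointing toward the light, shape (3,)
    intensity: float        # scalar intensity along that direction
    color: np.ndarray       # RGB triple in [0, 1]; an HSV triple could be used instead

    def __post_init__(self):
        # Normalize the direction so downstream shading code can rely on a unit vector.
        self.direction = self.direction / np.linalg.norm(self.direction)

# Example: white light of unit intensity arriving from directly above.
overhead = LightingSample(direction=np.array([0.0, 1.0, 0.0]),
                          intensity=1.0,
                          color=np.array([1.0, 1.0, 1.0]))
```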
The virtual object may be an object in a space generated by simulation on a computing device. The simulated virtual object may be an object that does not exist in the real scene, or an object whose prototype can be found in the real scene; this is not limited here.
In the related art, the lighting and material effects of a virtual object inserted into a real environment are realized by setting a virtual light source, and the color of the virtual object's surface is determined entirely by the object's own diffuse reflection color and the color of the virtual light source.
According to an exemplary embodiment of the present invention, based on the ambient light information, the illumination information received by the virtual object at the position at which it is inserted into the real environment can be determined. According to the specific position at which the virtual object is inserted into the real environment, the illumination received by the virtual object can be determined from the ambient light. It is to be understood that the received illumination information may differ for different insertion positions in the real environment; any method that determines the illumination received by the virtual object based on the ambient light falls within the scope of protection of the present invention, and details are not repeated here.
According to an exemplary embodiment of the present invention, the virtual object is rendered based on the received illumination information. After the illumination received by the virtual object once inserted into the real environment has been determined, the virtual object can be rendered based on that illumination information, that is, an image of the virtual object is generated from the virtual object with a computer program.
Through embodiments of the present invention, the virtual object is rendered based on the illumination information it receives at the position at which it is inserted into the real environment. The method of the invention thus enables the virtual object to respond in real time to changes in real-environment lighting and remain consistent with that lighting, which markedly improves the realism of the virtual object and makes its combination with the real environment closer and more natural.
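To make the flow of operations S210 to S230 concrete, the following Python sketch walks through the three operations under simplifying assumptions: the environment sphere is approximated by an equirectangular image, S220 gathers incident light by sampling that map over directions, and S230 is only a placeholder shader. The function names, the map representation and the sampling strategy are all assumptions made for illustration, not the claimed method itself.

```python
import numpy as np

def sample_environment(env_map: np.ndarray, direction: np.ndarray) -> np.ndarray:
    """Look up the environment-sphere color arriving from 'direction'.

    env_map is assumed to be an equirectangular HDR image of shape (H, W, 3).
    """
    d = direction / np.linalg.norm(direction)
    theta = np.arccos(np.clip(d[1], -1.0, 1.0))   # polar angle measured from +Y
    phi = np.arctan2(d[2], d[0])                  # azimuth
    h, w, _ = env_map.shape
    v = int(theta / np.pi * (h - 1))
    u = int((phi + np.pi) / (2 * np.pi) * (w - 1))
    return env_map[v, u]

def received_illumination(env_map, insert_position, n_samples=64):
    """Operation S220 (sketch): gather the light reaching the insertion position
    by sampling the environment sphere over random directions.
    insert_position is unused in this simplified sketch (no occlusion by real geometry)."""
    rng = np.random.default_rng(0)
    dirs = rng.normal(size=(n_samples, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return [(d, sample_environment(env_map, d)) for d in dirs]

def render_virtual_object(obj, illumination):
    """Operation S230 (placeholder): shade the object from the gathered illumination."""
    # A real renderer would evaluate a BRDF per surface point; see the PBR sketch below.
    total = sum(color for _, color in illumination) / len(illumination)
    return {"object": obj, "average_incident_light": total}

# S210 would produce env_map from camera frames (see the matching sketch later on);
# here we stand in a constant grey environment for demonstration.
env_map = np.full((64, 128, 3), 0.5, dtype=np.float32)
illum = received_illumination(env_map, insert_position=np.zeros(3))
print(render_virtual_object("teapot", illum)["average_incident_light"])
```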
Figure 3A schematically shows a flowchart of a virtual object rendering method for a mobile terminal according to another embodiment of the present invention.
As shown in Figure 3A, the virtual object rendering method for a mobile terminal may include operations S210, S220, S230 and S311. Operations S210, S220 and S230 are similar to the previous embodiment and are not repeated here.
In operation S311, pose information of the mobile terminal is obtained through the inertial measurement unit of the mobile terminal.
According to an exemplary embodiment of the present invention, the inertial measurement unit (IMU) organically combines a gyroscope and an accelerometer to provide richer and more accurate navigation information. The inertial measurement unit consists of a three-axis magnetic sensor, a three-axis angular rate sensor, a three-axis acceleration sensor and a signal processing unit; it can output the three-axis angular rates and acceleration values of the carrier, and is a device that measures the three-axis attitude angles (or angular rates) and accelerations of the mobile terminal.
According to an exemplary embodiment of the present invention, as the pose of the mobile terminal changes, the ambient light information of the real environment obtained through the camera of the mobile terminal may change, and, based on that ambient light information, the illumination received by the virtual object at its insertion position in the real environment may also change accordingly. Therefore, the pose information of the mobile terminal can also be obtained through the inertial measurement unit of the mobile terminal, providing data support for the mobile terminal to decide whether the ambient light information obtained through its camera should be changed as the pose of the mobile terminal changes.
There are many algorithms for obtaining the pose information of the mobile terminal by fusing IMU data, such as complementary filters, (linear) Kalman filters, and the Mahony and Madgwick filters. Embodiments of the present invention do not specifically limit the choice; the specific algorithm is not the focus of the present invention and is not repeated here.
Taking the Kalman filter as an example, obtaining pose information by data fusion is briefly described below; this is not a limitation on embodiments of the present invention.
The Kalman filter model assumes that the true state at time k evolves from the state at time k-1 according to
x_k = F_k x_{k-1} + B_k u_k + w_k
where F_k is the state transition model (matrix) applied to x_{k-1}, B_k is the control-input model applied to the control vector u_k, and w_k is the process noise, assumed to follow a zero-mean multivariate normal distribution with covariance matrix Q_k:
w_k ~ N(0, Q_k)
At time k, a measurement z_k of the true state x_k satisfies
z_k = H_k x_k + v_k
where H_k is the observation model, which maps the true state space into the observation space, and v_k is the observation noise, assumed to be normally distributed with zero mean and covariance matrix R_k:
v_k ~ N(0, R_k)
The initial state and the noise terms at each time step {x_0, w_1, ..., w_k, v_1, ..., v_k} are all regarded as mutually independent.
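As a concrete illustration of the model above, here is a minimal linear Kalman filter predict/update step in Python. It is the generic textbook formulation, not the patent's specific attitude-fusion pipeline, and the toy state layout and numerical values are assumptions chosen only for demonstration.

```python
import numpy as np

def kalman_step(x, P, z, F, B, u, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : prior state estimate and covariance
    z    : current measurement
    F, B, H, Q, R : models defined as in the equations above
    """
    # Predict: x_k|k-1 = F x_{k-1} + B u_k,  P_k|k-1 = F P F^T + Q
    x_pred = F @ x + B @ u
    P_pred = F @ P @ F.T + Q
    # Update with the measurement z_k = H x_k + v_k
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1D constant-velocity example (position, velocity); values are illustrative only.
dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
H = np.array([[1.0, 0.0]])                   # only position is observed
Q = 1e-4 * np.eye(2)
R = np.array([[1e-2]])
x, P = np.zeros(2), np.eye(2)
x, P = kalman_step(x, P, z=np.array([0.05]), F=F, B=B, u=np.array([0.0]), H=H, Q=Q, R=R)
print(x)
```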
Through embodiments of the present invention, because the pose information of the mobile terminal is obtained through its inertial measurement unit, data support can be provided to the mobile terminal, giving it a data basis for adaptively adjusting the ambient light information of the real environment according to the pose information.
Figure 3B schematically shows a flowchart of obtaining ambient light information of the real environment through the camera of the mobile terminal according to an embodiment of the present invention.
As shown in Figure 3B, obtaining the ambient light information of the real environment through the camera of the mobile terminal may include operations S321 and S322.
In operation S321, a plurality of dynamic scene images are obtained by shooting the real environment with the camera of the mobile terminal.
In operation S322, the plurality of dynamic scene images are matched to obtain the environment sphere of the real environment.
According to an exemplary embodiment of the present invention, a large number of real-scene pictures can be stored in a database, and these real-scene pictures all carry the light source information of the real environment at the time they were captured by the camera of the mobile terminal. The type of picture is not limited here; the dynamic scene images provided by exemplary embodiments of the present invention are high dynamic range (HDR) images, a type of image with comparatively better quality. For the plurality of HDR scene images, computer vision algorithms, for example image matching algorithms based on feature points, are used to find, among the real-scene pictures, those closest to the dynamic scene images obtained by shooting the real scene with the camera. From the information carried by those pictures, the environment sphere of the real scene, which reflects the ambient light directed at the object from every direction of the real environment, can be determined.
Through the embodiments of the present disclosure, the captured dynamic scene images are matched by computer vision algorithms, providing a method that can accurately determine the ambient light of the real environment with high computational efficiency.
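A minimal sketch of the feature-point matching step described above is given below, using ORB features and a brute-force matcher from OpenCV to pick the database picture closest to a captured frame. The choice of ORB, the matcher and the scoring rule are assumptions made for illustration; the embodiments do not prescribe a particular feature or matching algorithm.

```python
import cv2
import numpy as np

def closest_database_image(frame_gray: np.ndarray, database: list) -> int:
    """Return the index of the database image that best matches the captured frame.

    frame_gray and each database entry are assumed to be single-channel uint8 images.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    kp_f, des_f = orb.detectAndCompute(frame_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    best_idx, best_score = -1, -1
    for i, img in enumerate(database):
        kp_i, des_i = orb.detectAndCompute(img, None)
        if des_f is None or des_i is None:
            continue
        matches = matcher.match(des_f, des_i)
        # Score by the number of good (small-distance) matches; a real system
        # might add geometric verification with RANSAC before trusting a match.
        good = [m for m in matches if m.distance < 40]
        if len(good) > best_score:
            best_idx, best_score = i, len(good)
    return best_idx

# The light-source / environment-sphere information stored with database[best_idx]
# would then serve as the environment sphere for the current real scene.
```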
Figure 3C schematically shows a flowchart of a virtual object rendering method for a mobile terminal according to a further embodiment of the present invention.
As shown in Figure 3C, the virtual object rendering method for a mobile terminal may include operations S210, S220, S230, S311 and S331. Operations S210, S220, S230 and S311 are similar to the previous embodiments and are not repeated here.
In operation S331, the ambient light information is updated according to the pose information of the mobile terminal.
According to an exemplary embodiment of the present invention, as described above, after the pose information of the mobile terminal has been obtained through its inertial measurement unit, the rendering engine can continuously update the ambient light information according to the pose information of the mobile terminal; the specific update manner is not repeated here.
Through the embodiments of the present disclosure, because the ambient light information is updated according to the pose information, the ambient light information of the real environment can respond in real time to pose changes of the mobile terminal and always accurately reflect the ambient light of the real environment, making the combination of the virtual object and the real environment closer and more natural.
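One plausible way to update the ambient light information with the terminal's pose is to re-orient environment-sphere lookups by the IMU rotation, so that directions sampled in the device frame stay consistent with a fixed world frame. The sketch below does exactly that; it builds on the pipeline sketch given earlier, and this particular update scheme is an assumption, since the embodiments leave the concrete update manner open.

```python
import numpy as np

def quat_rotate(q: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Rotate vector v by the unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def sample_environment_world(env_map, device_dir, pose_quat):
    """Update step (sketch): convert a direction expressed in the device frame
    into the world frame using the IMU pose, then sample the environment sphere."""
    world_dir = quat_rotate(pose_quat, device_dir)
    return sample_environment(env_map, world_dir)  # reuses the earlier lookup sketch

# Example: the identity pose leaves directions unchanged.
identity = np.array([1.0, 0.0, 0.0, 0.0])
assert np.allclose(quat_rotate(identity, np.array([0.0, 0.0, 1.0])), [0.0, 0.0, 1.0])
```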
Figure 3D schematically shows a flowchart of rendering the virtual object based on the received illumination information according to an embodiment of the present invention.
As shown in Figure 3D, rendering the virtual object based on the received illumination information may include operations S341 and S342.
In operation S341, lighting information of the virtual object in each direction is determined according to the received illumination information.
In operation S342, the virtual object is rendered based on the lighting information.
In one embodiment of the invention, the lighting information includes light direction, light intensity and/or light color.
According to an exemplary embodiment of the present invention, based on the illumination information received by the virtual object at the position at which it is inserted into the real environment, the lighting information of the virtual object in each direction can be determined. Specifically, according to the surface parameters of the virtual object, the intensity of reflected and refracted light in each direction of its surface (the view direction and the normal direction) is calculated, and the color information of the environment sphere in the corresponding direction is sampled, thereby achieving the effect of mapping the real environment onto the object.
Through the embodiments of the present disclosure, because the lighting information of the virtual object in each direction is determined from the received illumination information, the object is rendered from the lighting that falls on it in each direction, achieving the effect of fusing it with the real environment and enhancing the realism of the virtual object.
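The reflect-and-sample step can be pictured as follows: mirror the view direction about the surface normal and look up the environment sphere along the result. The sketch below, which reuses the lookup from the earlier pipeline sketch, shows only the perfectly specular case as an assumed simplification; rough surfaces would integrate many directions around the reflection vector, which the PBR sketch further below approximates with a microfacet model.

```python
import numpy as np

def reflect(view_dir: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Mirror-reflect the incoming view direction about the surface normal."""
    v = view_dir / np.linalg.norm(view_dir)
    n = normal / np.linalg.norm(normal)
    return v - 2.0 * np.dot(v, n) * n

def specular_environment_color(env_map, view_dir, normal):
    """Sample the environment sphere along the mirror reflection direction (sketch)."""
    r = reflect(view_dir, normal)
    return sample_environment(env_map, r)  # reuses the lookup sketched earlier
```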
In one embodiment of the invention, determining the lighting information of the virtual object in each direction according to the received illumination information includes: determining the lighting information of the virtual object in each direction using a PBR algorithm.
According to an exemplary embodiment of the present invention, an embodiment is provided in which a PBR algorithm is used to determine the lighting information of the virtual object in each direction, but the method of determining the lighting information of the virtual object in each direction is not limited to this.
Specifically, PBR is physically based rendering: in the rendering process, according to a physical illumination model, the virtual object is given a texture close to real physical materials. Owing to space limitations, the specific implementation steps of this method are not repeated here.
Through the embodiments of the present disclosure, because the lighting information of the virtual object in each direction is determined using a PBR algorithm, the virtual object can be given a texture close to real physical materials, so that under the illumination of the real environment's ambient light the virtual object is consistent with the objects of the real environment, enhancing the realism.
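Because the specific PBR steps are omitted above, the following Python sketch shows one standard Cook-Torrance-style formulation (GGX normal distribution, Schlick Fresnel, Smith geometry term) for shading a single incoming light direction. The patent does not say which PBR model is used, so this formulation and its parameter names are assumptions offered only for illustration.

```python
import numpy as np

def cook_torrance_shade(n, v, l, albedo, roughness, metallic, light_color):
    """Cook-Torrance specular plus Lambertian diffuse for one light direction (sketch).

    n, v, l : unit normal, unit view direction (toward the eye), unit light direction.
    """
    h = (v + l) / np.linalg.norm(v + l)
    n_dot_l = max(np.dot(n, l), 0.0)
    n_dot_v = max(np.dot(n, v), 1e-4)
    n_dot_h = max(np.dot(n, h), 0.0)
    v_dot_h = max(np.dot(v, h), 0.0)

    a = roughness * roughness
    # GGX normal distribution function
    d = a * a / (np.pi * (n_dot_h * n_dot_h * (a * a - 1.0) + 1.0) ** 2 + 1e-7)
    # Schlick Fresnel, with base reflectance interpolated by the metallic parameter
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5
    # Smith-Schlick geometry term for direct lighting
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_v / (n_dot_v * (1.0 - k) + k)) * (n_dot_l / (n_dot_l * (1.0 - k) + k))

    specular = d * f * g / (4.0 * n_dot_v * n_dot_l + 1e-7)
    diffuse = (1.0 - f) * (1.0 - metallic) * albedo / np.pi
    return (diffuse + specular) * light_color * n_dot_l

# Example: white light striking a mildly rough dielectric surface at an angle.
n = np.array([0.0, 1.0, 0.0])
v = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
l = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
print(cook_torrance_shade(n, v, l, albedo=np.array([0.8, 0.2, 0.2]),
                          roughness=0.4, metallic=0.0, light_color=np.ones(3)))
```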
In one embodiment of the invention, for a virtual object with a translucent appearance, the light intensity of the virtual object in each direction includes the intensities of scattered and transmitted light calculated according to the subsurface scattering principle.
In one embodiment of the invention, the virtual object with a translucent appearance includes: jade, a candle, milk or skin.
According to an exemplary embodiment of the present invention, a transparent material can be seen through and allows a clear image to pass; the opposite is called opaque, while translucent materials, which lie between transparent and opaque materials, allow only part of the light to pass. Light is refracted when passing through translucent and transparent materials, so the image is distorted, and translucent materials also exhibit scattering.
For objects with a translucent appearance, such as jade, candles, milk or skin, the intensities of scattered and transmitted light can be calculated according to the subsurface scattering principle, and the color information of the environment sphere is then sampled to perform the rendering.
Through the embodiments of the present disclosure, because translucent virtual objects are handled with a different rendering method, the differences in the appearance materials of virtual objects are taken into account, and the rendering of a virtual object is adapted to its appearance material accordingly, so that the rendering method for virtual objects has a wider scope of application.
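The embodiments do not give a concrete subsurface-scattering formula, so the sketch below uses a common real-time approximation: exponential attenuation of light transmitted through an assumed local thickness, plus a soft "wrap" scattering term, with both the front and back light taken from the environment sphere of the earlier sketches. Every constant and function name here is an illustrative assumption.

```python
import numpy as np

def translucent_shading(env_map, n, l, thickness, sigma_t, scatter_color):
    """Approximate scattered + transmitted intensity for a translucent surface (sketch).

    thickness : assumed local object thickness along the light path
    sigma_t   : extinction coefficient controlling how quickly light is absorbed
    """
    # Transmission: light entering from the back side, attenuated exponentially.
    back_light = sample_environment(env_map, -l)        # reuses the earlier lookup
    transmitted = back_light * np.exp(-sigma_t * thickness)

    # Scattering: a soft "wrap" diffuse term so light bleeds past the terminator.
    wrap = 0.5
    n_dot_l = np.dot(n, l)
    scatter = scatter_color * max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

    front_light = sample_environment(env_map, l)
    return front_light * scatter + transmitted
```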
In one embodiment of the invention, obtaining the pose information of the mobile terminal through the inertial measurement unit of the mobile terminal includes: determining a quaternion from the 9-axis data of the inertial measurement unit of the mobile terminal to obtain the pose information of the mobile terminal.
According to an exemplary embodiment of the present invention, the 9-axis data may include the three-dimensional (X/Y/Z) magnetic vector measured by the magnetic sensor, the three-dimensional (X/Y/Z) angular rate data measured by the angular rate sensor, and the three-dimensional (X/Y/Z) acceleration data measured by the acceleration sensor. A quaternion can be determined from the 9-axis data, and the pose information of the mobile terminal can finally be determined from the quaternion.
Specifically, the gyroscope in the IMU outputs the rotation angle of the camera between adjacent moments, and the accelerometer of the IMU outputs the acceleration of the camera between adjacent moments, i.e. the rate of change of velocity. The specific algorithm is not the focus of the present invention and is not repeated here.
Through the embodiments of the present disclosure, because a quaternion is determined from the 9-axis IMU data to obtain the pose information of the mobile terminal, the pose information is calculated accurately and can objectively and accurately reflect the true pose of the mobile terminal.
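As one concrete, assumed way of deriving an orientation quaternion from 9-axis data, the sketch below builds a tilt-compensated orientation from the accelerometer and magnetometer alone, leaving the gyroscope integration out for brevity; the embodiments do not prescribe a particular fusion algorithm such as this one.

```python
import numpy as np

def quaternion_from_accel_mag(accel: np.ndarray, mag: np.ndarray) -> np.ndarray:
    """Build an orientation quaternion (w, x, y, z) from gravity and the magnetic field.

    accel : accelerometer reading, roughly opposite to gravity when static
    mag   : magnetometer reading in the same device frame
    """
    # Device axes expressed via a world triad: 'up' from gravity, 'east' and 'north'
    # from the horizontal component of the magnetic field.
    up = accel / np.linalg.norm(accel)
    east = np.cross(mag, up)
    east /= np.linalg.norm(east)
    north = np.cross(up, east)

    # Rotation matrix with the world axes as rows, then convert to a quaternion.
    r = np.vstack([east, north, up])
    w = np.sqrt(max(1.0 + np.trace(r), 1e-9)) / 2.0
    x = (r[2, 1] - r[1, 2]) / (4.0 * w)
    y = (r[0, 2] - r[2, 0]) / (4.0 * w)
    z = (r[1, 0] - r[0, 1]) / (4.0 * w)
    q = np.array([w, x, y, z])
    return q / np.linalg.norm(q)

# Example: device lying flat, magnetic field pointing roughly north and downward.
print(quaternion_from_accel_mag(np.array([0.0, 0.0, 9.8]),
                                np.array([0.0, 20.0, -40.0])))
```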
Exemplary system
After describing the method for exemplary embodiment of the invention, next, with reference to figure 4, Fig. 5 A~Fig. 5 D to this Invention illustrative embodiments, be described in detail for the dummy object rendering system of mobile terminal.
An embodiment of the present invention provides a kind of dummy object rendering systems for mobile terminal.
Fig. 4 schematically shows the frames of the dummy object rendering system according to the ... of the embodiment of the present invention for mobile terminal Figure.
As shown in figure 4, the system 400 may include the first acquisition module 410, determining module 420 and rendering module 430. Wherein:
First acquisition module 410 is used to obtain the ambient light information of true environment by the camera of mobile terminal.
Determining module 420 is used to be based on ambient light information, determines position of the dummy object in being inserted into true environment Place be subject to by irradiation by irradiation information.
Rendering module 430 is used to, based on by irradiation information, render dummy object.
By the embodiment of the present disclosure, pass through what is be subject at the position in being inserted into true environment based on dummy object By irradiation by irradiation information, dummy object is rendered, method of the invention keeps dummy object true with real-time response Ambient lighting changes, and is consistent with the illumination in true environment, to be obviously improved the sense of reality of dummy object so that empty The combination of quasi- object and true environment is even closer, natural.
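For readers who want to map the modules of system 400 onto code, a bare skeleton might look as follows; the class and method names are assumptions, and the bodies simply delegate to the method sketches given earlier in this description.

```python
class VirtualObjectRenderingSystem:
    """Sketch of system 400: acquisition, determining and rendering modules."""

    def acquire_ambient_light(self, frames):
        """First acquisition module 410: build the environment sphere from camera frames."""
        # e.g. match frames against a database as in the feature-matching sketch above
        raise NotImplementedError

    def determine_received_illumination(self, env_map, insert_position):
        """Determining module 420: illumination received at the insertion position."""
        return received_illumination(env_map, insert_position)  # earlier sketch

    def render(self, virtual_object, illumination):
        """Rendering module 430: shade the object from the gathered illumination."""
        return render_virtual_object(virtual_object, illumination)  # earlier sketch
```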
Figure 5A schematically shows a block diagram of a virtual object rendering system for a mobile terminal according to another embodiment of the present invention.
As shown in Figure 5A, the system 400 may include the first acquisition module 410, the determining module 420, the rendering module 430 and a second acquisition module 511, where the second acquisition module 511 is configured to obtain pose information of the mobile terminal through the inertial measurement unit of the mobile terminal.
Through the embodiments of the present disclosure, because the pose information of the mobile terminal is obtained through its inertial measurement unit, data support can be provided to the mobile terminal, giving it a data basis for adaptively adjusting the ambient light information of the real environment according to the pose information.
Figure 5B schematically shows a block diagram of the first acquisition module according to an embodiment of the present invention.
As shown in Figure 5B, the first acquisition module 410 includes a shooting unit 521 and a matching unit 522, where the shooting unit 521 is configured to obtain a plurality of dynamic scene images by shooting the real environment with the camera of the mobile terminal, and the matching unit 522 is configured to match the plurality of dynamic scene images to obtain the environment sphere of the real environment.
Through the embodiments of the present disclosure, the captured dynamic scene images are matched by computer vision algorithms, providing a method that can accurately determine the ambient light of the real environment with high computational efficiency.
Figure 5C schematically shows a block diagram of a virtual object rendering system for a mobile terminal according to a further embodiment of the present invention.
As shown in Figure 5C, the system 400 may include the first acquisition module 410, the determining module 420, the rendering module 430, the second acquisition module 511 and an update module 531, where the update module 531 is configured to update the ambient light information according to the pose information of the mobile terminal.
Through the embodiments of the present disclosure, because the ambient light information is updated according to the pose information, the ambient light information of the real environment can respond in real time to pose changes of the mobile terminal and always accurately reflect the ambient light of the real environment, making the combination of the virtual object and the real environment closer and more natural.
Figure 5D schematically shows a block diagram of the rendering module according to an embodiment of the present invention.
As shown in Figure 5D, the rendering module 430 includes a determination unit 541 and a rendering unit 542, where the determination unit 541 is configured to determine, according to the received illumination information, the lighting information of the virtual object in each direction, and the rendering unit 542 is configured to render the virtual object based on the lighting information.
In one embodiment of the invention, the lighting information includes light direction, light intensity and/or light color.
Through the embodiments of the present disclosure, because the lighting information of the virtual object in each direction is determined from the received illumination information, the object is rendered from the lighting that falls on it in each direction, achieving the effect of fusing it with the real environment and enhancing the realism of the virtual object.
In one embodiment of the invention, determining the lighting information of the virtual object in each direction according to the received illumination information includes: determining the lighting information of the virtual object in each direction using a PBR algorithm.
Through the embodiments of the present disclosure, because the lighting information of the virtual object in each direction is determined using a PBR algorithm, the virtual object can be given a texture close to real physical materials, so that under the illumination of the real environment's ambient light the virtual object is consistent with the objects of the real environment, enhancing the realism.
In one embodiment of the invention, for a virtual object with a translucent appearance, the light intensity of the virtual object in each direction includes the intensities of scattered and transmitted light calculated according to the subsurface scattering principle.
In one embodiment of the invention, the virtual object with a translucent appearance includes: jade, a candle, milk or skin.
Through the embodiments of the present disclosure, because translucent virtual objects are handled with a different rendering method, the differences in the appearance materials of virtual objects are taken into account, and the rendering of a virtual object is adapted to its appearance material accordingly, so that the rendering method for virtual objects has a wider scope of application.
In one embodiment of the invention, the second acquisition module is further configured to: determine a quaternion from the 9-axis data of the inertial measurement unit of the mobile terminal to obtain the pose information of the mobile terminal.
Through the embodiments of the present disclosure, because a quaternion is determined from the 9-axis IMU data to obtain the pose information of the mobile terminal, the pose information is calculated accurately and can objectively and accurately reflect the true pose of the mobile terminal.
Exemplary media
An embodiment of the present invention provides a medium storing computer-executable instructions which, when executed by a processing unit, implement the virtual object rendering method for a mobile terminal of any one of the method embodiments above.
In some possible embodiments, aspects of the present invention may also be implemented in the form of a program product that includes program code; when the program product is run on a terminal device, the program code causes the terminal device to perform the steps of the virtual object rendering methods for a mobile terminal according to the various exemplary embodiments of the present invention described in the "Illustrative methods" part of this specification. For example, the terminal device may perform operation S210 shown in Figure 2, obtaining ambient light information of the real environment through the camera of the mobile terminal; operation S220, determining, based on the ambient light information, the illumination information received by the virtual object at the position at which it is inserted into the real environment; and operation S230, rendering the virtual object based on the received illumination information.
The program product may use any combination of one or more readable media. A readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
As shown in Figure 6, a program product 60 for virtual object rendering on a mobile terminal according to an embodiment of the present invention is described; it may use a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present invention is not limited to this; in this document, a readable storage medium may be any tangible medium that contains or stores a program which can be used by, or in connection with, an instruction execution system, apparatus or device.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying readable program code. Such a propagated data signal may take many forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium that can send, propagate or transmit a program for use by, or in connection with, an instruction execution system, apparatus or device.
Program code contained on a readable medium may be transmitted using any suitable medium, including, but not limited to, wireless, wired, optical cable, RF, etc., or any suitable combination of the above.
Program code for carrying out the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. In scenarios involving a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Exemplary computing device
After the methods, systems and media of exemplary embodiments of the present invention have been described, a computing device for virtual object rendering on a mobile terminal according to an exemplary embodiment of the present invention is next introduced with reference to Figure 7.
An embodiment of the present invention further provides a computing device. The computing device includes: a processing unit; and a storage unit storing computer-executable instructions which, when executed by the processing unit, implement the virtual object rendering method for a mobile terminal of any one of the method embodiments above.
Those skilled in the art will appreciate that aspects of the present invention may be implemented as a system, a method or a program product. Accordingly, aspects of the present invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software, which may here be referred to collectively as a "circuit", a "module" or a "system".
In some possible embodiments, a computing device according to the present invention may include at least one processing unit and at least one storage unit. The storage unit stores program code which, when executed by the processing unit, causes the processing unit to perform the steps of the methods according to the various exemplary embodiments of the present invention described in the "Illustrative methods" part of this specification. For example, the processing unit may perform operation S210 shown in Figure 2, obtaining ambient light information of the real environment through the camera of the mobile terminal; operation S220, determining, based on the ambient light information, the illumination information received by the virtual object at the position at which it is inserted into the real environment; and operation S230, rendering the virtual object based on the received illumination information.
The dummy object for mobile terminal that this embodiment according to the present invention is described referring to Fig. 7 renders The computing device 70 of method.Computing device 70 as shown in Figure 7 is only an example, should not be to the function of the embodiment of the present invention Any restrictions are brought with use scope.
As shown in Fig. 7, the computing device 70 takes the form of a general-purpose computing device. The components of the computing device 70 may include, but are not limited to, the above-mentioned at least one processing unit 701, the above-mentioned at least one storage unit 702, and a bus 703 connecting different system components (including the storage unit 702 and the processing unit 701).
The bus 703 represents one or more of several classes of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The storage unit 702 may include readable media in the form of volatile memory, such as a random access memory (RAM) 7021 and/or a cache memory 7022, and may further include a read-only memory (ROM) 7023.
The storage unit 702 may also include a program/utility 7025 having a set of (at least one) program modules 7024. Such program modules 7024 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a network environment.
The computing device 70 may also communicate with one or more external devices 704 (such as a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the computing device 70, and/or with any device (such as a router, a modem, etc.) that enables the computing device 70 to communicate with one or more other computing devices. Such communication may be carried out via an input/output (I/O) interface 705. Moreover, the computing device 70 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 706. As shown, the network adapter 706 communicates with the other modules of the computing device 70 through the bus 703. It should be understood that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the computing device 70, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
It should be noted that, although several units/modules or sub-units/modules of the system are mentioned in the above detailed description, this division is merely exemplary and not mandatory. In fact, according to embodiments of the present invention, the features and functions of two or more units/modules described above may be embodied in one unit/module. Conversely, the features and functions of one unit/module described above may be further divided and embodied by a plurality of units/modules.
In addition, although the operations of the method of the present invention are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the operations shown must be performed, to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Although the spirit and principles of the present invention have been described with reference to several detailed embodiments, it should be understood that the present invention is not limited to the specific embodiments disclosed, and the division into various aspects does not mean that features in those aspects cannot be combined to advantage; such division is merely for convenience of description. The present invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (10)

1. A dummy object rendering method for a mobile terminal, comprising:
obtaining ambient light information of a real environment through a camera of the mobile terminal;
determining, based on the ambient light information, irradiation information to which a dummy object is subjected at a position where the dummy object is to be inserted into the real environment; and
rendering the dummy object based on the irradiation information.
2. The dummy object rendering method according to claim 1, wherein the method further comprises:
obtaining pose information of the mobile terminal through an inertial measurement unit of the mobile terminal.
3. The dummy object rendering method according to claim 1, wherein obtaining the ambient light information of the real environment through the camera of the mobile terminal comprises:
obtaining a plurality of dynamic scene images by shooting the real environment with the camera of the mobile terminal; and
matching the plurality of dynamic scene images to obtain an environment sphere of the real environment.
4. The dummy object rendering method according to claim 2, wherein the method further comprises:
updating the ambient light information according to the pose information of the mobile terminal.
5. The dummy object rendering method according to claim 1, wherein rendering the dummy object based on the irradiation information comprises:
determining, according to the irradiation information, lighting information of the dummy object in all directions; and
rendering the dummy object based on the lighting information.
6. The dummy object rendering method according to claim 5, wherein determining, according to the irradiation information, the lighting information of the dummy object in all directions comprises:
determining, according to the irradiation information, the lighting information of the dummy object in all directions using a PBR algorithm.
7. The dummy object rendering method according to claim 5, wherein the lighting information comprises a light direction, a light intensity, and/or a light color.
8. A dummy object rendering system for a mobile terminal, comprising:
a first acquisition module, configured to obtain ambient light information of a real environment through a camera of the mobile terminal;
a determining module, configured to determine, based on the ambient light information, irradiation information to which a dummy object is subjected at a position where the dummy object is to be inserted into the real environment; and
a rendering module, configured to render the dummy object based on the irradiation information.
9. A medium storing computer-executable instructions which, when executed by a processing unit, implement the dummy object rendering method according to any one of claims 1 to 7.
10. A computing device, comprising:
a processing unit; and
a storage unit storing computer-executable instructions which, when executed by the processing unit, implement the dummy object rendering method according to any one of claims 1 to 7.
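For illustration only, and not as part of the claims, the following toy Python sketch suggests one way the ideas recited in claims 2 to 7 above could fit together: an environment sphere sampled per direction, with its orientation corrected by the pose reported by the inertial measurement unit. The EnvironmentSphere class, the single-axis rotation, and the six-face sampling are hypothetical simplifications chosen for brevity and do not represent the claimed PBR algorithm.

# Illustrative sketch only: a coarse environment sphere whose orientation is
# updated from the device pose, plus a per-direction lighting lookup. All
# names are hypothetical simplifications, not APIs defined by the claims.
import math

class EnvironmentSphere:
    """A very coarse environment sphere: one RGB value per axis direction."""
    def __init__(self, face_colors):
        # face_colors maps axis-direction tuples to (r, g, b) values.
        self.face_colors = face_colors

    def sample(self, direction):
        # Return the stored face whose axis best matches the query direction.
        best = max(self.face_colors,
                   key=lambda axis: sum(a * d for a, d in zip(axis, direction)))
        return self.face_colors[best]

def rotate_yaw(direction, yaw_radians):
    """Rotate a direction about the vertical axis; a stand-in for the full
    pose-based update of the ambient light information (claims 2 and 4)."""
    x, y, z = direction
    c, s = math.cos(yaw_radians), math.sin(yaw_radians)
    return (c * x + s * z, y, -s * x + c * z)

def lighting_in_direction(env_sphere, direction, device_yaw):
    """Per-direction lighting lookup (a stand-in for the step of claim 6):
    rotate the query into the sphere's frame, then sample the sphere."""
    return env_sphere.sample(rotate_yaw(direction, -device_yaw))

if __name__ == "__main__":
    sphere = EnvironmentSphere({
        (0, 1, 0): (1.0, 1.0, 0.9),   # bright sky overhead
        (0, -1, 0): (0.2, 0.2, 0.2),  # dark ground below
        (1, 0, 0): (0.6, 0.5, 0.4),
        (-1, 0, 0): (0.3, 0.3, 0.3),
        (0, 0, 1): (0.5, 0.5, 0.5),
        (0, 0, -1): (0.4, 0.4, 0.4),
    })
    # Device rotated 90 degrees about the vertical axis.
    print(lighting_in_direction(sphere, (1, 0, 0), device_yaw=math.pi / 2))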
CN201810131051.6A 2018-02-08 2018-02-08 Dummy object rendering intent, system, medium and computing device Pending CN108305328A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810131051.6A CN108305328A (en) 2018-02-08 2018-02-08 Dummy object rendering intent, system, medium and computing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810131051.6A CN108305328A (en) 2018-02-08 2018-02-08 Dummy object rendering intent, system, medium and computing device

Publications (1)

Publication Number Publication Date
CN108305328A true CN108305328A (en) 2018-07-20

Family

ID=62864850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810131051.6A Pending CN108305328A (en) 2018-02-08 2018-02-08 Dummy object rendering intent, system, medium and computing device

Country Status (1)

Country Link
CN (1) CN108305328A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105144245A (en) * 2013-04-24 2015-12-09 高通股份有限公司 Apparatus and method for radiance transfer sampling for augmented reality
CN103679204A (en) * 2013-12-23 2014-03-26 上海安琪艾可网络科技有限公司 Image identification and creation application system and method based on intelligent mobile device platform
WO2015142446A1 (en) * 2014-03-17 2015-09-24 Qualcomm Incorporated Augmented reality lighting with dynamic geometry

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JEAN-FRANCOIS LALONDE et al.: "Webcam Clip Art: Appearance and Illuminant Transfer from Time-lapse Sequences", ACM Transactions on Graphics *
PAUL DEBEVEC: "Rendering Synthetic Objects into Real Scenes: Bridging Traditional and Image-based Graphics with Global Illumination and High Dynamic Range Photography", SIGGRAPH 98 Conference *
WANG Ke: "A Survey on Virtual-Real Illumination Consistency in Augmented Reality", Electro-Optic Technology Application *
QIN Yun: "ARCore: The Wave of Mobile AR", https://www.infoq.cn/article/2017/09/arcore-moblie-ar/ *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109166170A (en) * 2018-08-21 2019-01-08 百度在线网络技术(北京)有限公司 Method and apparatus for rendering augmented reality scene
CN110866966A (en) * 2018-08-27 2020-03-06 苹果公司 Rendering virtual objects with realistic surface properties matching the environment
CN110866966B (en) * 2018-08-27 2023-12-15 苹果公司 Rendering virtual objects with realistic surface properties that match an environment
CN109240505A (en) * 2018-09-27 2019-01-18 湖南航天捷诚电子装备有限责任公司 Head-mounted display device and outdoor viewing method using the head-mounted display device
CN109712226A (en) * 2018-12-10 2019-05-03 网易(杭州)网络有限公司 See-through model rendering method and device for virtual reality
CN109754452B (en) * 2018-12-28 2022-12-27 北京达佳互联信息技术有限公司 Image rendering processing method and device, electronic equipment and storage medium
CN109754452A (en) * 2018-12-28 2019-05-14 北京达佳互联信息技术有限公司 Image rendering processing method and device, electronic equipment and storage medium
CN110310224A (en) * 2019-07-04 2019-10-08 北京字节跳动网络技术有限公司 Light effect rendering method and device
CN110503711A (en) * 2019-08-22 2019-11-26 三星电子(中国)研发中心 Method and device for rendering a virtual object in augmented reality
CN110503711B (en) * 2019-08-22 2023-02-21 三星电子(中国)研发中心 Method and device for rendering virtual object in augmented reality
CN111127624A (en) * 2019-12-27 2020-05-08 珠海金山网络游戏科技有限公司 Illumination rendering method and device based on AR scene
CN111260769A (en) * 2020-01-09 2020-06-09 北京中科深智科技有限公司 Real-time rendering method and device based on dynamic illumination change
CN111260769B (en) * 2020-01-09 2021-04-13 北京中科深智科技有限公司 Real-time rendering method and device based on dynamic illumination change
CN111339112A (en) * 2020-02-27 2020-06-26 武汉科技大学 Building material informatization management and control method and system based on building information model
CN111339112B (en) * 2020-02-27 2023-07-25 武汉科技大学 Building material informatization management and control method and system based on building information model
CN111833423A (en) * 2020-06-30 2020-10-27 北京市商汤科技开发有限公司 Presentation method, presentation device, presentation equipment and computer-readable storage medium
CN111862344A (en) * 2020-07-17 2020-10-30 北京字节跳动网络技术有限公司 Image processing method, apparatus and storage medium
CN111862344B (en) * 2020-07-17 2024-03-08 抖音视界有限公司 Image processing method, apparatus and storage medium
EP3970818A1 (en) * 2020-09-18 2022-03-23 XRSpace CO., LTD. Method for adjusting skin tone of avatar and avatar skin tone adjusting system
CN112530219A (en) * 2020-12-14 2021-03-19 北京高途云集教育科技有限公司 Teaching information display method and device, computer equipment and storage medium
CN112884873B (en) * 2021-03-12 2023-05-23 腾讯科技(深圳)有限公司 Method, device, equipment and medium for rendering virtual object in virtual environment
CN112884873A (en) * 2021-03-12 2021-06-01 腾讯科技(深圳)有限公司 Rendering method, device, equipment and medium for virtual object in virtual environment
CN113379884A (en) * 2021-07-05 2021-09-10 北京百度网讯科技有限公司 Map rendering method and device, electronic equipment, storage medium and vehicle
CN113379884B (en) * 2021-07-05 2023-11-17 北京百度网讯科技有限公司 Map rendering method, map rendering device, electronic device, storage medium and vehicle
CN113610955A (en) * 2021-08-11 2021-11-05 北京果仁互动科技有限公司 Object rendering method and device and shader
WO2023142035A1 (en) * 2022-01-29 2023-08-03 华为技术有限公司 Virtual image processing method and apparatus

Similar Documents

Publication Publication Date Title
CN108305328A (en) Dummy object rendering intent, system, medium and computing device
US20220301269A1 (en) Utilizing topological maps for augmented or virtual reality
US11694392B2 (en) Environment synthesis for lighting an object
Billinghurst et al. Designing augmented reality interfaces
EP2147412B1 (en) 3d object scanning using video camera and tv monitor
Piekarski Interactive 3d modelling in outdoor augmented reality worlds
CN105190703A (en) Using photometric stereo for 3D environment modeling
CN116091676B (en) Face rendering method of virtual object and training method of point cloud feature extraction model
CN108010118A (en) Virtual objects processing method, virtual objects processing unit, medium and computing device
US11195323B2 (en) Managing multi-modal rendering of application content
CN108346178A (en) Mixed reality object presentation
CN107403461A (en) Generating a random sampling distribution using directional scattering transformation
CN110458924B (en) Three-dimensional face model establishing method and device and electronic equipment
CN115965727A (en) Image rendering method, device, equipment and medium
JP2022034560A (en) Method and system for displaying large 3d model on remote device
CN116228943B (en) Virtual object face reconstruction method, face reconstruction network training method and device
CN115375810A (en) Transforming the discrete light attenuation into spectral data for rendering an object volume
JP5848071B2 (en) A method for estimating the scattering of light in a homogeneous medium.
US20240015263A1 (en) Methods and apparatus to provide remote telepresence communication
Viveiros Augmented reality and its aspects: a case study for heating systems
Shen THE PURDUE UNIVERSITY GRADUATE SCHOOL STATEMENT OF DISSERTATION APPROVAL
Beebe A Bibliography of Publications in IEEE Computer Graphics and Applications
Егизбаев et al. AN OVERVIEW OF AUGMENTED REALITY AND ITS APPLICATION IN THE FIELD OF INTERIOR DESIGN
Kolokouris Interactive presentation of cultural content using hybrid technologies, geographical systems and three-dimensional technologies.
TW202347261A (en) Stereoscopic features in virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20190618

Address after: 311200 Room 102, Block 6, Area C, Qianjiang Century Park, Xiaoshan District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Yixian Advanced Technology Co.,Ltd.

Address before: 310052 Floors 4 and 7, No. 599, Network Business Road, Changhe Street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: NETEASE (HANGZHOU) NETWORK Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20180720