CN110070621A - Electronic device, method, and computer-readable medium for displaying an augmented reality scene - Google Patents

Electronic device, method, and computer-readable medium for displaying an augmented reality scene

Info

Publication number
CN110070621A
Authority
CN
China
Prior art keywords
augmented reality
virtual object
image
images
environment image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910049440.9A
Other languages
Chinese (zh)
Other versions
CN110070621B (en)
Inventor
吴昱霆
陈镜阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HTC Corp
Original Assignee
High Tech Computer Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by High Tech Computer Corp
Publication of CN110070621A
Application granted granted Critical
Publication of CN110070621B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/60 Shadow generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00 Indexing scheme for image rendering
    • G06T2215/12 Shadow map, environment map
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00 Indexing scheme for image rendering
    • G06T2215/16 Using real world measurements to influence rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

An electronic device, a method for displaying an augmented reality scene, and a non-transitory computer-readable medium are provided. The electronic device includes a camera unit, a display, and a processor. The processor is electrically connected to the camera unit and the display. The camera unit is configured to capture a plurality of images of a physical environment. The display is configured to display an augmented reality image. The processor is configured to stitch the images to generate an environment image, to calculate ambient light and a reflection of the physical environment corresponding to a virtual object according to the environment image, to analyze a direction of a light source according to the environment image to calculate a shadow corresponding to the virtual object, and to render the virtual object onto the augmented reality image according to the ambient light, the reflection, and the shadow. Thereby, a head-mounted display device can achieve the technical effect of displaying the virtual object in the augmented reality scene.

Description

Electronic device, method, and computer-readable medium for displaying an augmented reality scene
Technical field
The present disclosure relates to an electronic device, a method for displaying an augmented reality scene, and a non-transitory computer-readable medium, and in particular to an electronic device, a method, and a non-transitory computer-readable medium for displaying a virtual object in an augmented reality scene.
Background
In augmented reality (AR) and virtual reality (VR) systems, if a virtual object is to be blended into a real-world environment, the light source configuration of the virtual object needs to match the light source configuration of the real world as closely as possible. Owing to performance constraints, existing real-time environment lighting techniques can usually only partially simulate shadows. However, the shadow of a virtual object provides depth cues to the viewer and greatly improves rendering quality. Therefore, there is a need in this field for an electronic device of a mixed reality or virtual reality system that captures the light source configuration of the real-world environment.
Summary of the invention
According to a first embodiment of the present disclosure, an electronic device is disclosed. The electronic device includes a camera unit, a display, and a processor. The camera unit is configured to capture a plurality of images of a physical environment. The display is configured to display an augmented reality image, wherein the content of the augmented reality image includes a virtual object and objects corresponding to the physical environment. The processor is electrically connected to the camera unit and the display. The processor is configured to stitch the images to generate an environment image, to calculate ambient light and a reflection of the physical environment corresponding to the virtual object according to the environment image, to analyze a direction of a light source according to the environment image to calculate a shadow corresponding to the virtual object, and to render the virtual object onto the augmented reality image according to the ambient light, the reflection, and the shadow.
According to one embodiment of the present disclosure, the processor is further configured to execute the following steps: calculating a total brightness of the environment image; dividing the environment image into a plurality of regions according to the total brightness, wherein the brightnesses of the regions are equal to each other; sorting the regions according to the size of each region to generate a sorting result; selecting one of the regions according to the sorting result, and generating at least one directional light corresponding to the selected region; calculating a light source vector corresponding to the at least one directional light according to the position of the selected region and the position of the virtual object; and generating the shadow corresponding to the virtual object according to the light source vector.
According to a second embodiment of the present disclosure, a method for displaying an augmented reality scene is disclosed. The method includes: capturing a plurality of images of a physical environment by a camera unit; stitching the images by a processor to generate an environment image; calculating, by the processor, ambient light and a reflection of the physical environment corresponding to a virtual object according to the environment image; analyzing, by the processor, a direction of a light source according to the environment image to calculate a shadow corresponding to the virtual object; rendering, by the processor, the virtual object onto an augmented reality image according to the ambient light, the reflection, and the shadow; and displaying the augmented reality image by a display, wherein the content of the augmented reality image includes the virtual object and objects corresponding to the physical environment.
According to one embodiment of the present disclosure, when a viewpoint matches one of a plurality of matching targets, the camera unit is configured to capture one of the images, wherein the viewpoint moves and rotates along with the electronic device, and the matching targets surround the viewpoint.
According to one embodiment of the present disclosure, the processor is configured to receive the images from the camera unit, wherein each image includes a corresponding position of the camera unit.
According to one embodiment of the present disclosure, stitching the images by the processor to generate the environment image further includes: calculating, by the processor, a quality of a current image according to the distance between a current position and an initial position; and, if the quality of the current image is greater than a threshold value, stitching the current image into the environment image; wherein one of the images is an initial image, and the initial image includes the corresponding initial position of the camera unit.
According to one embodiment of the present disclosure, calculating, by the processor, the ambient light of the physical environment corresponding to the virtual object according to the environment image further includes: calculating, by the processor, the ambient light corresponding to the virtual object from the environment image using spherical harmonics.
According to one embodiment of the present disclosure, calculating, by the processor, the reflection of the physical environment corresponding to the virtual object according to the environment image further includes: generating, by the processor, a cube map according to the environment image, and calculating the reflection corresponding to the virtual object according to the cube map.
According to one embodiment of the present disclosure, analyzing, by the processor, the direction of the light source according to the environment image to calculate the shadow corresponding to the virtual object further includes: calculating a total brightness of the environment image; dividing the environment image into a plurality of regions according to the total brightness, wherein the brightnesses of the regions are equal to each other; sorting the regions according to the size of each region to generate a sorting result; selecting one of the regions according to the sorting result, and generating at least one directional light corresponding to the selected region; calculating a light source vector corresponding to the at least one directional light according to the position of the selected region and the position of the virtual object; and generating the shadow corresponding to the virtual object according to the light source vector.
According to a third embodiment of the present disclosure, a non-transitory computer-readable medium is disclosed. The non-transitory computer-readable medium includes at least one instruction program, which is executed by a processor to carry out a method for displaying an augmented reality scene, the method including: capturing a plurality of images of a physical environment by a camera unit; stitching the images by a processor to generate an environment image; calculating, by the processor, ambient light and a reflection of the physical environment corresponding to a virtual object according to the environment image; analyzing, by the processor, a direction of a light source according to the environment image to calculate a shadow corresponding to the virtual object; rendering, by the processor, the virtual object onto an augmented reality image according to the ambient light, the reflection, and the shadow; and displaying the augmented reality image by a display, wherein the content of the augmented reality image includes the virtual object and objects corresponding to the physical environment.
Brief description of the drawings
To make the above and other objects, features, advantages, and embodiments of the present disclosure more comprehensible, the accompanying drawings are described as follows:
Fig. 1 is a block diagram of an electronic device according to some embodiments of the present disclosure;
Fig. 2 is a flowchart of a method for displaying an augmented reality scene according to some embodiments of the present disclosure;
Fig. 3A is a schematic diagram of an augmented reality scene according to some embodiments of the present disclosure;
Fig. 3B is a schematic diagram of matching targets and an initial viewpoint according to some embodiments of the present disclosure;
Fig. 3C is a schematic diagram of an environment image according to some embodiments of the present disclosure;
Fig. 3D is a schematic diagram of an environment image according to some embodiments of the present disclosure;
Fig. 4 is a schematic diagram of the field of view of an electronic device in a physical environment according to some embodiments of the present disclosure;
Fig. 5A is a schematic diagram of a virtual object according to some embodiments of the present disclosure;
Fig. 5B is a schematic diagram of rendering a virtual object in a physical environment according to some embodiments of the present disclosure;
Fig. 6 is a flowchart of step S240 according to some embodiments of the present disclosure;
Fig. 7A is a schematic diagram of regions of an environment image according to some embodiments of the present disclosure;
Fig. 7B is a schematic diagram of regions of an environment image according to some embodiments of the present disclosure;
Fig. 8 is a schematic diagram of a light source illuminating a physical environment according to some embodiments of the present disclosure; and
Fig. 9 is a schematic diagram of rendering a virtual object in a physical environment according to some embodiments of the present disclosure.
Description of symbols:
100: electronic device
110: camera unit
130: processor
150: display
200: method for displaying augmented reality scene
SCN: augmented reality scene
MT: matching target
VP, VP': viewpoint
Img, Img1, Img2: image
EI: environment image
PE: physical environment
R, R1, R2, R3, R4, R5, R6, A1, A2, A3, A4: region
VO: virtual object
PO1, PO2, PO3: object
L: lamp
S1, S2, S3: shade
SL1, SL2, SL3: dividing line
S210~S260, S241~S246: step
Detailed description
The following disclosure provides many different embodiments or examples to implement different features of the present disclosure. Elements and configurations in specific examples are used in the following discussion to simplify the present disclosure. Any example discussed is for illustrative purposes only and does not in any way limit the scope or meaning of the present disclosure or of its examples. In addition, the present disclosure may repeat reference numerals and/or letters in different examples; such repetition is for simplicity and clarity, and does not in itself specify a relationship between the different embodiments and/or configurations discussed below.
Please refer to Fig. 1, which is a block diagram of an electronic device 100 according to some embodiments of the present disclosure. As shown in Fig. 1, the electronic device 100 includes a camera unit 110, a processor 130, and a display 150, and the processor 130 is electrically connected to the camera unit 110 and the display 150. In some embodiments, the electronic device 100 can be a device of an augmented reality (AR) system or a mixed reality (MR) system, such as a head-mounted device (HMD). The processor 130 can be implemented as a graphics processor and/or a central processing unit. When a user wears the electronic device 100, the display 150 covers the user's field of view, and the display 150 is configured to display a mixed reality scene or an augmented reality scene to the user.
The processor 130 is coupled to the display 150. The processor 130 is configured to process image/video data of the augmented reality (or mixed reality) scene to be displayed on the display 150. The augmented reality/mixed reality scene may include virtual objects (e.g., a teapot, a statue, a glass, etc.) and/or features of the physical environment (e.g., walls, doors, desks, windows, etc.).
Please refer to Fig. 2, which is a flowchart of a method 200 for displaying an augmented reality scene according to some embodiments of the present disclosure. In one embodiment, the method 200 can be used to execute the computation for rendering a virtual object, and to combine the virtual object with features of the physical environment to form an augmented reality image. In other words, the content of the augmented reality image includes the virtual object and objects corresponding to the physical environment. The augmented reality scene is thus formed by a plurality of augmented reality images.
Please refer to Fig. 1 and Fig. 2 together. In the embodiment shown in Fig. 2, the method 200 for displaying an augmented reality scene first executes step S210: capturing a plurality of images of the physical environment by the camera unit 110. Please refer to Fig. 3A, which is a schematic diagram of an augmented reality scene SCN according to some embodiments of the present disclosure. In this embodiment, the augmented reality scene SCN may include image data Img of a spherical augmented reality scene SCN surrounding the user wearing the electronic device 100. The present disclosure is not limited to a spherical augmented reality scene; in another embodiment, the augmented reality scene SCN may include image data Img of a hemispherical augmented reality scene, a toroidal augmented reality scene, or another similar shape.
Next, in the method 200 for displaying an augmented reality scene, the processor 130 stitches the images to generate an environment image. Please refer to Fig. 3B, which is a schematic diagram of matching targets MT and an initial viewpoint VP according to some embodiments of the present disclosure. In this embodiment, the environment image can be understood as a skybox of the physical environment. As shown in Fig. 3B, when the viewpoint VP matches one of the plurality of matching targets MT, the camera unit 110 of the electronic device 100 captures an image Img of the augmented reality scene SCN. Notably, each image captured by the camera unit 110 includes a corresponding position of the camera unit 110.
Then, as shown in Fig. 3B, the viewpoint VP moves and rotates along with the electronic device 100, and the matching targets MT surround the viewpoint VP. When the viewpoint VP matches one of the matching targets MT, the camera unit 110 captures an initial image Img1, and the coordinate of the initial position of the camera unit 110 is Ip(0, 0, 0).
In one embodiment, an auxiliary mark can be used to assist the user in aligning with a matching target MT. The auxiliary mark can be implemented as a shape identical to or different from that of the matching target MT, and the number of matching targets MT can be implemented as greater than 12; however, the number and shape of the auxiliary marks are not limited thereto.
Fig. 3 C is please referred to, Fig. 3 C is the schematic diagram of the environmental images EI according to shown by some embodiments of the present disclosure.Such as Shown in Fig. 3 C, processor 130 is splicing initial image Img1 into environmental images EI.When initial image Img1 has been spelled It is connected to environmental images EI, vision point P ' is moved to and rotates to new position and match another matching target MT, camera unit 110 To shoot another image Img2.
Then, the processor 130 further calculates the quality of the current image according to the distance between the current position and the initial position. The current position corresponds to the position of the camera unit 110 when capturing the image Img2, and the quality of the current image Img2 can be obtained by the calculation of Formula 1. For example, the coordinate of the initial position of the camera unit 110 is Ip(0, 0, 0) and the coordinate of the current position of the camera unit 110 is Cp(10, 10, 10). The quality of the current image Img2 can thus be obtained by Formula 1: the farther the current position is from the initial position, the worse the quality of the current image Img2 will be. In other words, distance and quality are inversely related.
Qp = Distance(Cp, Ip)^(-1)    (Formula 1)
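Formula 1 can be sketched as follows. This is a minimal illustration, assuming the distance is Euclidean and using the function name `image_quality` for convenience; the patent only states that quality is the inverse of the distance between Cp and Ip:

```python
import math

def image_quality(cp, ip):
    """Quality Qp of a candidate image: the inverse of the distance
    between the current camera position Cp and the initial position Ip
    (Formula 1). Images captured closer to Ip score higher."""
    distance = math.dist(cp, ip)
    return 1.0 / distance if distance > 0 else float("inf")

# Example from the text: Ip(0, 0, 0) and Cp(10, 10, 10)
q = image_quality((10, 10, 10), (0, 0, 0))
print(round(q, 4))  # ~0.0577: the farther from Ip, the lower the quality
```

A stitching loop would compare this value against the threshold mentioned below before accepting the image into the environment image.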
Fig. 3 D is please referred to, Fig. 3 D is the schematic diagram of the environmental images EI according to shown by some embodiments of the present disclosure.Such as Shown in Fig. 3 D, if the quality of current image Img2 is greater than threshold value, processor 130 is splicing current image Img2 to ring In the image EI of border, initial image Img1 and current image Img2 are overlapped in the R of region, and processor 130 is to more initial shadow As the quality of Img1 and the quality of current image Img2.In an embodiment, there is more low-quality image will be ignored, Overlapping region R will be updated to the image of higher quality.In the case, the quality of initial image Img1 is higher than current shadow As the quality of Img2, therefore, overlapping region R will be updated to the data of initial image Img1.
In another embodiment, in the overlapping region R, the image with the higher quality is blended with the image with the lower quality. In this case, the overlapping region R is updated with the blended data of the initial image Img1 and the current image Img2. The blending ratio is adjustable; for example, if the initial image Img1 has the higher quality, the blending ratio of the initial image Img1 can be set to 0.8 and the blending ratio of the current image Img2 can be set to 0.2. However, the present disclosure is not limited thereto.
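The quality-weighted blending of the overlapping region R can be sketched per pixel as follows. The 0.8/0.2 ratio comes from the example above; the function name and the tuple-of-channels pixel representation are illustrative assumptions:

```python
def blend_overlap(pixel_hi, pixel_lo, ratio_hi=0.8):
    """Blend two overlapping pixels: the higher-quality image's pixel
    is weighted by ratio_hi, the lower-quality one by (1 - ratio_hi)."""
    ratio_lo = 1.0 - ratio_hi
    return tuple(ratio_hi * a + ratio_lo * b for a, b in zip(pixel_hi, pixel_lo))

# RGB pixel from Img1 (higher quality) blended with one from Img2
print(blend_overlap((200, 100, 50), (100, 200, 150)))  # (180.0, 120.0, 70.0)
```

Applying this over every pixel of the overlap implements the "mixed data" update of region R; setting `ratio_hi=1.0` degenerates into the keep-the-better-image variant of the previous paragraph.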
In this embodiment, as shown in Fig. 3B, after the viewpoint VP has matched all of the matching targets MT, the generation of the environment image EI can be understood to be complete. Notably, the images captured by the camera unit 110 correspond to the physical environment; the environment image EI is therefore a skybox corresponding to the physical environment.
Please refer to Fig. 4, which is a schematic diagram of the field of view of the electronic device 100 in the physical environment PE according to some embodiments of the present disclosure. In this embodiment, the camera unit 110 of the electronic device 100 captures a plurality of images of the physical environment PE. The physical environment PE contains a lamp L and three objects PO1, PO2, and PO3, and the objects PO1, PO2, and PO3 have different textures.
Fig. 5 A is please referred to, Fig. 5 A is the schematic diagram of the dummy object VO according to shown by some embodiments of the present disclosure.In In this embodiment, when processor 130 is to render to dummy object VO in physical environment PE, user can pass through electronics Device 100 sees dummy object VO in augmented reality scene SCN.Assuming that dummy object VO is the bottle with metal material, As shown in Figure 5A, the texture of dummy object VO is default texture (for example, netted).Due to the light source from lamp L, object PO1 tool There is shade S1 and object PO2 that there is shade S2.However, in this case, due to have the dummy object VO of default texture with Object around in physical environment PE mismatches, it will be seen that the dummy object VO presented by processor 130 is in physical rings It is in the PE of border and untrue.
Next, the method 200 for displaying an augmented reality scene executes step S230: calculating, by the processor 130, the ambient light and the reflection of the physical environment PE corresponding to the virtual object VO according to the environment image EI. In this embodiment, the processor 130 calculates the ambient light corresponding to the virtual object VO from the environment image EI using spherical harmonics (SH). In game engines (e.g., Unity 3D, Unreal Engine, etc.), spherical harmonics can be used for indirect lighting (ambient occlusion, global illumination, etc.). Therefore, the ambient light can be generated from the environment image EI using spherical harmonics.
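The projection of environment radiance onto spherical harmonics can be sketched as follows. This is a generic band-0/band-1 Monte-Carlo projection, not the patent's specific implementation; the sample count, seed, and the example radiance function (a sky brighter toward +z) are assumptions for illustration:

```python
import math
import random

def sh_basis(x, y, z):
    """First four real spherical-harmonics basis functions (bands 0-1),
    the low-frequency set commonly used for ambient light."""
    return [
        0.282095,       # Y_0^0 (constant term)
        0.488603 * y,   # Y_1^-1
        0.488603 * z,   # Y_1^0
        0.488603 * x,   # Y_1^1
    ]

def project_to_sh(radiance, n_samples=20000, seed=1):
    """Monte-Carlo projection of a spherical radiance function onto
    4 SH coefficients: c_i = (4*pi / N) * sum radiance(d) * Y_i(d)."""
    rng = random.Random(seed)
    coeffs = [0.0] * 4
    for _ in range(n_samples):
        # Uniform direction on the unit sphere
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(1.0 - z * z)
        d = (r * math.cos(phi), r * math.sin(phi), z)
        light = radiance(*d)
        for i, y in enumerate(sh_basis(*d)):
            coeffs[i] += light * y
    return [c * 4.0 * math.pi / n_samples for c in coeffs]

# Radiance concentrated in the upper hemisphere: the z coefficient dominates band 1
coeffs = project_to_sh(lambda x, y, z: max(z, 0.0))
print([round(c, 2) for c in coeffs])
```

In practice the radiance function would sample the environment image EI by direction instead of being an analytic lambda, and engines typically use 9 coefficients (bands 0-2) rather than 4.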
Then, the processor 130 generates a cube map according to the environment image EI, and calculates the reflection corresponding to the virtual object VO according to the cube map. In 3D computer graphics, a cube map generally comprises six texture-mapped faces, and each cube face can represent details of the environment, such as the color or texture of the illumination. A cube map can therefore be used to give the rendered virtual object realistic ambient-light effects, so that the virtual object has a realistic appearance and feel within the background of the scene.
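A cube-map reflection lookup can be sketched as follows: mirror the view direction about the surface normal, then map the reflected direction to a cube face and (u, v) coordinate. The face-orientation convention shown here is one common choice (similar to the OpenGL layout) and is an assumption, since the patent does not fix one:

```python
def reflect(d, n):
    """Mirror reflection of direction d about unit surface normal n."""
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dot * b for a, b in zip(d, n))

def cubemap_face_uv(x, y, z):
    """Map a 3D direction to a cube-map face name and (u, v) in [0, 1]."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:      # dominant x axis
        face = "+x" if x > 0 else "-x"
        u, v = (-z / ax if x > 0 else z / ax), -y / ax
    elif ay >= az:                 # dominant y axis
        face = "+y" if y > 0 else "-y"
        u, v = x / ay, (z / ay if y > 0 else -z / ay)
    else:                          # dominant z axis
        face = "+z" if z > 0 else "-z"
        u, v = (x / az if z > 0 else -x / az), -y / az
    return face, (u + 1.0) / 2.0, (v + 1.0) / 2.0

# A view ray hitting a surface whose normal faces the camera reflects straight back
r = reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0))
print(cubemap_face_uv(*r))  # ('+z', 0.5, 0.5)
```

Sampling the selected face's texture at (u, v) then yields the reflected environment color that is blended into the virtual object's surface.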
Fig. 5 B is please referred to, Fig. 5 B is rendered virtually in physical environment PE according to shown by some embodiments of the present disclosure The schematic diagram of object VO.As described in top, in step S230, the corresponding physics of dummy object VO is calculated according to environmental images EI The environment light of environment PE and reflection.For example, as shown in Figure 5 B, the texture of object PO1 is with red plus sige shape line It manages (such as "+"), the texture of object PO2 is with coffee-like oblique line shape texture (such as "/"), and the texture of object PO3 is tool The texture of the dotted texture (such as " ") and dummy object VO that have blue is the reticular texture with grey.Processor 130 After executing the step S230, the reflection from object PO1 to dummy object VO, which has calculated, to be finished and on dummy object VO (for example, plus sige shape texture is shown in the A1 of region) is shown in the A1 of region.And so on, from object PO2 to dummy object VO's Reflection, which has calculated, to be finished and is shown in the region A2 on dummy object VO (for example, oblique line shape texture shows in the A2 of region Out).Reflection from object PO3 to dummy object VO, which has calculated, to be finished and shows (example in the region A3 on dummy object VO Such as, dotted texture is shown in the A3 of region).
As shown in Fig. 5B, the appearance of the rendered virtual object VO now carries the colors and textures of the other objects in the physical environment PE (the slash, plus-sign, and dotted textures). Furthermore, as shown in the region A4 on the virtual object VO, the texture of the virtual object VO is the gray default texture. It should be understood that the surface (or appearance) of the virtual object VO is not directly and completely covered by the reflections from the other objects PO1~PO3; rather, its own mesh texture is blended with the reflected textures of the other objects PO1~PO3. The regions A1~A4 are shown for illustration only; in practice, the colors of the objects PO1~PO3 and the original color of the virtual object VO can be blended to render the surface of the virtual object VO. Notably, the environment image EI can be used to generate the ambient light and the reflection corresponding to the virtual object VO, so the virtual object VO generated after step S230 can be more realistic than the virtual object VO shown in Fig. 5A.
Next, the method 200 for displaying an augmented reality scene executes step S240: analyzing, by the processor 130, the direction of the light source according to the environment image EI to calculate the shadow corresponding to the virtual object VO. Step S240 further includes steps S241~S246; please refer to Fig. 6, which is a flowchart of step S240 according to some embodiments of the present disclosure. In the embodiment shown in Fig. 6, the method 200 further executes step S241: calculating a total brightness of the environment image EI. In this embodiment, the processor 130 sums the grayscale values of the environment image EI; for example, the total brightness of the environment image EI is 20000.
Next, the method 200 further executes step S242: dividing the environment image EI into a plurality of regions according to the total brightness. Please refer to Fig. 7A, which is a schematic diagram of regions of the environment image EI according to some embodiments of the present disclosure. As shown in Fig. 7A, the processor 130 divides the environment image EI into regions R1 and R2 by the dividing line SL1. The brightness of the region R1 is equal to the brightness of the region R2; that is, the brightnesses of the regions R1 and R2 are both 10000.
Next, please refer to Fig. 7B, which is a schematic diagram of regions of the environment image EI according to some embodiments of the present disclosure. As shown in Fig. 7B, the processor 130 further divides the region R1 into regions R3 and R4 by the dividing line SL2, and divides the region R2 into regions R5 and R6 by the dividing line SL3. The brightness of the region R3 is equal to the brightness of the region R4 (both are 5000), and the brightness of the region R5 is equal to the brightness of the region R6 (both are 5000). The number of times step S242 is executed can be set by the number of dividing lines or by the brightness of the regions: as the number of executions of step S242 increases, the brightness sum of each region decreases and the number of regions increases.
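Steps S241-S242 resemble a median-cut subdivision into regions of (nearly) equal total brightness. A minimal sketch under that assumption, representing the grayscale image as a list of lists and regions as (top, left, bottom, right) boxes, might look like:

```python
def box_brightness(img, box):
    """Sum of grayscale values inside a (top, left, bottom, right) box."""
    t, l, b, r = box
    return sum(img[y][x] for y in range(t, b) for x in range(l, r))

def split_box(img, box):
    """Split a box along its longer side where cumulative brightness
    reaches half of the box total (equal-brightness halves, step S242)."""
    t, l, b, r = box
    half = box_brightness(img, box) / 2.0
    if (r - l) >= (b - t):            # split with a vertical dividing line
        acc = 0
        for x in range(l, r - 1):
            acc += sum(img[y][x] for y in range(t, b))
            if acc >= half:
                return (t, l, b, x + 1), (t, x + 1, b, r)
        return (t, l, b, r - 1), (t, r - 1, b, r)
    else:                             # split with a horizontal dividing line
        acc = 0
        for y in range(t, b - 1):
            acc += sum(img[y][x] for x in range(l, r))
            if acc >= half:
                return (t, l, y + 1, r), (y + 1, l, b, r)
        return (t, l, b - 1, r), (b - 1, l, b, r)

def median_cut(img, depth):
    """Apply step S242 `depth` times: each pass splits every region."""
    boxes = [(0, 0, len(img), len(img[0]))]
    for _ in range(depth):
        boxes = [half for box in boxes for half in split_box(img, box)]
    return boxes

# A 2x4 image whose rightmost column is a bright "lamp":
img = [[1, 1, 1, 13],
       [1, 1, 1, 13]]
regions = median_cut(img, 2)
print(regions)  # [(0, 0, 2, 2), (0, 2, 2, 3), (0, 3, 1, 4), (1, 3, 2, 4)]
```

The bright column ends up in the smallest boxes, which is exactly why step S244 below treats small regions as light-source candidates.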
Next, the method 200 further executes step S243: sorting the regions according to the size of each region to generate a sorting result. In the foregoing embodiment, each region has a similar brightness value (e.g., 5000). If the size of a region is small, the region becomes a candidate for a light source. The sorting result can therefore be understood as arranging the regions by size (from largest to smallest). In another embodiment, the sorting result can also be arranged from smallest to largest; however, the present disclosure is not limited thereto.
Next, the method 200 further executes step S244: selecting one of the regions according to the sorting result, and generating at least one directional light corresponding to the selected region. Continuing the previous embodiment, if the processor 130 selects the region with the smallest size, the selected region becomes the light source and generates a directional light. In another embodiment, the processor 130 can sequentially select a plurality of regions according to the sorting result, or the number of selected regions can be determined by the user; however, the means of selecting regions is not limited thereto.
Next, the method 200 further executes step S245: calculating a light source vector corresponding to the at least one directional light according to the position of the selected region and the position of the virtual object VO. Please refer to Fig. 8, which is a schematic diagram of a light source illuminating the physical environment PE according to some embodiments of the present disclosure. In the example shown in Fig. 8, because the selected region is a part of the environment image EI, the position of the selected region in the environment image EI can be obtained. In this case, the selected region is the region of the lamp L. Then, because the augmented reality scene SCN is generated from the environment image EI, the position of the lamp L in the spherical augmented reality scene SCN can be obtained (the z-axis coordinate can be obtained from a preset value, a multi-view sensor, or a depth sensor). For similar reasons, the position of the virtual object VO in the spherical augmented reality scene SCN can be obtained. The coordinate of the lamp L is (x1, y1, z1) and the coordinate of the virtual object VO is (x2, y2, z2); the light source vector can therefore be calculated from the coordinate of the lamp L and the coordinate of the virtual object VO.
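Step S245 can be sketched as the normalized vector from the virtual object VO toward the light source. The concrete coordinates below are hypothetical stand-ins for (x1, y1, z1) and (x2, y2, z2):

```python
import math

def light_source_vector(lamp, obj):
    """Normalized direction from the virtual object (x2, y2, z2) toward
    the light source (x1, y1, z1) — the vector used to cast the shadow."""
    v = tuple(l - o for l, o in zip(lamp, obj))
    norm = math.sqrt(sum(c * c for c in v))
    return tuple(c / norm for c in v)

# Hypothetical lamp L at (2, 3, 1) and virtual object VO at (2, 0, 1)
print(light_source_vector((2.0, 3.0, 1.0), (2.0, 0.0, 1.0)))  # (0.0, 1.0, 0.0)
```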
Next, the method 200 for displaying an augmented reality scene further executes step S246: generating the shadow corresponding to the virtual object VO according to the light source vector. Please also refer to Fig. 9, which is a schematic diagram, according to some embodiments of the present disclosure, of rendering the virtual object VO in the physical environment PE. As shown in Fig. 9, the processor 130 generates the shadow S3 of the virtual object VO according to the light source vector. Therefore, because of the light source from the lamp L, the virtual object VO has the shadow S3.
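The patent does not specify how the shadow S3 is constructed from the light source vector; one common simple technique is planar shadow projection, sketched here under the assumption of a flat ground plane at y = 0 (both the technique and the plane are assumptions for illustration, not the patent's method):

```python
def project_shadow(point, light_dir, ground_y=0.0):
    """Project one point of the virtual object onto the ground plane
    y = ground_y along the light direction (a unit vector pointing from
    the light source toward the object).  Returns the shadow point, or
    None when the light travels upward and casts no ground shadow."""
    px, py, pz = point
    dx, dy, dz = light_dir
    if dy >= 0:
        return None
    t = (ground_y - py) / dy  # distance along the ray to reach the plane
    return (px + t * dx, ground_y, pz + t * dz)

# A point 2 units above the ground, lit straight down, shadows directly below it.
print(project_shadow((1.0, 2.0, 1.0), (0.0, -1.0, 0.0)))  # → (1.0, 0.0, 1.0)
```

Projecting every silhouette vertex of the virtual object this way yields a flat shadow polygon such as S3.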
Next, the method 200 for displaying an augmented reality scene further executes step S250: rendering, by the processor 130, the virtual object VO on the augmented reality image according to the ambient light, the reflection, and the shadow. According to the foregoing embodiments, the virtual object VO generated after step S240 can be more realistic than the virtual object VO shown in Fig. 5B. The method 200 for displaying an augmented reality scene further executes step S260: displaying the augmented reality image by the display 150. When the processor 130 renders the virtual object VO, the display 150 displays the augmented reality scene SCN. When the processor 130 renders the virtual object VO in the physical environment PE according to the ambient light, the reflection, and the shadow, the virtual object VO becomes more consistent with the real-world objects (for example, the objects PO1, PO2, and PO3).
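How the three estimated terms are combined in step S250 is likewise implementation-dependent; a toy per-pixel combination, with entirely invented weights, might look like the following (the function name and the shadow attenuation factor are assumptions, not from the patent):

```python
def shade_pixel(albedo, ambient, reflection, in_shadow, shadow_strength=0.5):
    """Toy per-channel combination of the three terms estimated in the method:
    the ambient light tints the surface albedo, the reflection term is added,
    and a shadowed pixel is darkened.  All weights are illustrative only."""
    color = tuple(a * amb + refl
                  for a, amb, refl in zip(albedo, ambient, reflection))
    if in_shadow:
        color = tuple(c * (1.0 - shadow_strength) for c in color)
    return color

# A white surface under half-intensity ambient light with a faint reflection:
print(shade_pixel((1.0, 1.0, 1.0), (0.5, 0.5, 0.5), (0.1, 0.1, 0.1), False))
```

In a real renderer each term would come from the earlier steps: the ambient term from the spherical-harmonic estimate, the reflection from the cube map, and the shadow flag from the projected shadow geometry.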
Another embodiment discloses a non-transitory computer-readable medium that stores an instruction program for executing the method 200 for displaying an augmented reality scene as shown in Fig. 2.
According to the foregoing embodiments, the electronic device, the method for displaying an augmented reality scene, and the non-transitory computer-readable medium are able to generate an environmental image of the physical environment, calculate the ambient light, the reflection, and the shadow according to the environmental image, and render the virtual object in the augmented reality scene according to the ambient light, the reflection, and the shadow. In some embodiments, a head-mounted display device is able to display the virtual object in the augmented reality scene.
In addition, the illustrations above include exemplary steps in sequence, but those steps need not be executed in the order shown. Executing those steps in a different order is within the contemplated scope of the present disclosure. Within the spirit and scope of the embodiments of the present disclosure, steps may be added, replaced, reordered, and/or omitted as appropriate.
Although the disclosure has been disclosed above by way of embodiments, they are not intended to limit the disclosure. Any person skilled in the art may make various changes and modifications without departing from the spirit and scope of the disclosure; therefore, the protection scope of the disclosure shall be determined by the appended claims.

Claims (10)

1. An electronic device, comprising:
a camera unit, configured to capture a plurality of images of a physical environment;
a display, configured to display an augmented reality image, wherein the content of the augmented reality image comprises a virtual object and an object corresponding to the physical environment; and
a processor, electrically connected to the camera unit and the display, the processor being configured to:
stitch the images to generate an environmental image;
calculate, according to the environmental image, an ambient light and a reflection of the physical environment corresponding to the virtual object;
analyze a direction of a light source according to the environmental image to calculate a shadow corresponding to the virtual object; and
render the virtual object on the augmented reality image according to the ambient light, the reflection, and the shadow.
2. The electronic device of claim 1, wherein the processor is further configured to execute the following steps:
calculating a total brightness of the environmental image;
dividing the environmental image into a plurality of regions according to the total brightness, wherein the brightness values of the regions are equal to each other;
sorting the regions according to the size of each region to generate a ranking result;
selecting one of the regions according to the ranking result, and generating at least one directional light corresponding to the selected region;
calculating a light source vector corresponding to the at least one directional light according to the position of the selected region and the position of the virtual object; and
generating the shadow corresponding to the virtual object according to the light source vector.
3. A method for displaying an augmented reality scene, comprising:
capturing, by a camera unit, a plurality of images of a physical environment;
stitching, by a processor, the images to generate an environmental image;
calculating, by the processor, according to the environmental image, an ambient light and a reflection of the physical environment corresponding to a virtual object;
analyzing, by the processor, a direction of a light source according to the environmental image to calculate a shadow corresponding to the virtual object;
rendering, by the processor, the virtual object on an augmented reality image according to the ambient light, the reflection, and the shadow; and
displaying, by a display, the augmented reality image, wherein the content of the augmented reality image comprises the virtual object and an object corresponding to the physical environment.
4. The method for displaying an augmented reality scene of claim 3, wherein the camera unit captures one of the images when a viewpoint matches one of a plurality of matching targets, wherein the viewpoint moves and rotates with the electronic device, and the matching targets surround the viewpoint.
5. The method for displaying an augmented reality scene of claim 3, wherein the processor is configured to receive the images from the camera unit, wherein each of the images comprises a corresponding position of the camera unit.
6. The method for displaying an augmented reality scene of claim 5, wherein stitching, by the processor, the images to generate the environmental image further comprises:
calculating, by the processor, a quality of a current image according to a distance between a current position and an initial position; and
stitching the current image into the environmental image if the quality of the current image is greater than a threshold value;
wherein one of the images is an initial image, and the initial image comprises the initial position corresponding to the camera unit.
7. The method for displaying an augmented reality scene of claim 3, wherein calculating, by the processor, according to the environmental image, the ambient light of the physical environment corresponding to the virtual object further comprises:
calculating, by the processor, the ambient light corresponding to the virtual object using a spherical harmonic function according to the environmental image.
8. The method for displaying an augmented reality scene of claim 4, wherein calculating, by the processor, according to the environmental image, the reflection of the physical environment corresponding to the virtual object further comprises:
generating, by the processor, a cube map according to the environmental image, and calculating the reflection corresponding to the virtual object according to the cube map.
9. The method for displaying an augmented reality scene of claim 3, wherein analyzing, by the processor, the direction of the light source according to the environmental image to calculate the shadow corresponding to the virtual object further comprises:
calculating a total brightness of the environmental image;
dividing the environmental image into a plurality of regions according to the total brightness, wherein the brightness values of the regions are equal to each other;
sorting the regions according to the size of each region to generate a ranking result;
selecting one of the regions according to the ranking result, and generating at least one directional light corresponding to the selected region;
calculating a light source vector corresponding to the at least one directional light according to the position of the selected region and the position of the virtual object; and
generating the shadow corresponding to the virtual object according to the light source vector.
10. A non-transitory computer-readable medium comprising at least one instruction program, the instruction program being executed by a processor to perform a method for displaying an augmented reality scene, the method comprising:
capturing, by a camera unit, a plurality of images of a physical environment;
stitching, by the processor, the images to generate an environmental image;
calculating, by the processor, according to the environmental image, an ambient light and a reflection of the physical environment corresponding to a virtual object;
analyzing, by the processor, a direction of a light source according to the environmental image to calculate a shadow corresponding to the virtual object;
rendering, by the processor, the virtual object on an augmented reality image according to the ambient light, the reflection, and the shadow; and
displaying, by a display, the augmented reality image, wherein the content of the augmented reality image comprises the virtual object and an object corresponding to the physical environment.
CN201910049440.9A 2018-01-19 2019-01-18 Electronic device, method for displaying augmented reality scene and computer readable medium Active CN110070621B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862619130P 2018-01-19 2018-01-19
US62/619,130 2018-01-19

Publications (2)

Publication Number Publication Date
CN110070621A true CN110070621A (en) 2019-07-30
CN110070621B CN110070621B (en) 2023-04-07

Family

ID=65138884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910049440.9A Active CN110070621B (en) 2018-01-19 2019-01-18 Electronic device, method for displaying augmented reality scene and computer readable medium

Country Status (4)

Country Link
US (1) US10636200B2 (en)
EP (1) EP3514766A3 (en)
CN (1) CN110070621B (en)
TW (1) TWI711966B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113393516A (en) * 2021-06-17 2021-09-14 北京房江湖科技有限公司 Method and apparatus for breaking up virtual objects in an AR scene
CN114979457A (en) * 2021-02-26 2022-08-30 华为技术有限公司 Image processing method and related device

Families Citing this family (10)

Publication number Priority date Publication date Assignee Title
JP7003994B2 (en) * 2017-08-08 2022-01-21 ソニーグループ株式会社 Image processing equipment and methods
US10607567B1 (en) 2018-03-16 2020-03-31 Amazon Technologies, Inc. Color variant environment mapping for augmented reality
US10777010B1 (en) 2018-03-16 2020-09-15 Amazon Technologies, Inc. Dynamic environment mapping for augmented reality
US10559121B1 (en) * 2018-03-16 2020-02-11 Amazon Technologies, Inc. Infrared reflectivity determinations for augmented reality rendering
US10930049B2 (en) * 2018-08-27 2021-02-23 Apple Inc. Rendering virtual objects with realistic surface properties that match the environment
US11620794B2 (en) * 2018-12-14 2023-04-04 Intel Corporation Determining visually reflective properties of physical surfaces in a mixed reality environment
CN111489448A (en) * 2019-01-24 2020-08-04 宏达国际电子股份有限公司 Method for detecting real world light source, mixed reality system and recording medium
CN111199573B (en) * 2019-12-30 2023-07-07 成都索贝数码科技股份有限公司 Virtual-real interaction reflection method, device, medium and equipment based on augmented reality
CN111639613B (en) * 2020-06-04 2024-04-16 上海商汤智能科技有限公司 Augmented reality AR special effect generation method and device and electronic equipment
CN112562051B (en) * 2020-11-30 2023-06-27 腾讯科技(深圳)有限公司 Virtual object display method, device, equipment and storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
US20070236485A1 (en) * 2006-03-31 2007-10-11 Microsoft Corporation Object Illumination in a Virtual Environment
US20090046099A1 (en) * 2006-11-13 2009-02-19 Bunkspeed Real-time display system
US20140125668A1 (en) * 2012-11-05 2014-05-08 Jonathan Steed Constructing augmented reality environment with pre-computed lighting
US20140240354A1 (en) * 2013-02-28 2014-08-28 Samsung Electronics Co., Ltd. Augmented reality apparatus and method
CN104766270A (en) * 2015-03-20 2015-07-08 北京理工大学 Virtual and real lighting fusion method based on fish-eye lens
US20150301592A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Utilizing totems for augmented or virtual reality systems
WO2016011788A1 (en) * 2014-07-24 2016-01-28 央数文化(上海)股份有限公司 Augmented reality technology-based handheld reading device and method thereof
CN107093204A (en) * 2017-04-14 2017-08-25 苏州蜗牛数字科技股份有限公司 It is a kind of that the method for virtual objects effect of shadow is influenceed based on panorama

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
EP2072947B1 (en) * 2007-08-07 2016-03-16 Panasonic Intellectual Property Management Co., Ltd. Image processing device and image processing method
US9355433B1 (en) * 2015-06-30 2016-05-31 Gopro, Inc. Image stitching in a multi-camera array
KR20180099026A (en) * 2017-02-28 2018-09-05 삼성전자주식회사 Photographing method using external electronic device and electronic device supporting the same

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
US20070236485A1 (en) * 2006-03-31 2007-10-11 Microsoft Corporation Object Illumination in a Virtual Environment
US20090046099A1 (en) * 2006-11-13 2009-02-19 Bunkspeed Real-time display system
US20140125668A1 (en) * 2012-11-05 2014-05-08 Jonathan Steed Constructing augmented reality environment with pre-computed lighting
US20140240354A1 (en) * 2013-02-28 2014-08-28 Samsung Electronics Co., Ltd. Augmented reality apparatus and method
US20150301592A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Utilizing totems for augmented or virtual reality systems
US20150301599A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Eye tracking systems and method for augmented or virtual reality
WO2016011788A1 (en) * 2014-07-24 2016-01-28 央数文化(上海)股份有限公司 Augmented reality technology-based handheld reading device and method thereof
CN104766270A (en) * 2015-03-20 2015-07-08 北京理工大学 Virtual and real lighting fusion method based on fish-eye lens
CN107093204A (en) * 2017-04-14 2017-08-25 苏州蜗牛数字科技股份有限公司 It is a kind of that the method for virtual objects effect of shadow is influenceed based on panorama

Non-Patent Citations (1)

Title
PAUL DEBEVEC: "Image-Based Lighting", 《IEEE COMPUTER GRAPHICS AND APPLICATIONS》 *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN114979457A (en) * 2021-02-26 2022-08-30 华为技术有限公司 Image processing method and related device
CN114979457B (en) * 2021-02-26 2023-04-07 华为技术有限公司 Image processing method and related device
CN113393516A (en) * 2021-06-17 2021-09-14 北京房江湖科技有限公司 Method and apparatus for breaking up virtual objects in an AR scene

Also Published As

Publication number Publication date
US20190228568A1 (en) 2019-07-25
EP3514766A2 (en) 2019-07-24
TWI711966B (en) 2020-12-01
TW201933084A (en) 2019-08-16
US10636200B2 (en) 2020-04-28
EP3514766A3 (en) 2019-08-14
CN110070621B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN110070621A (en) Electronic device, the method and computer readable media for showing augmented reality scene
Kruijff et al. Perceptual issues in augmented reality revisited
CA2282637C (en) Method for rendering shadows on a graphical display
US20170061675A1 (en) Point and click lighting for image based lighting surfaces
Li et al. Physically-based editing of indoor scene lighting from a single image
Lombardi et al. Radiometric scene decomposition: Scene reflectance, illumination, and geometry from rgb-d images
EP3329464A1 (en) Robust attribute transfer for character animation
US9183654B2 (en) Live editing and integrated control of image-based lighting of 3D models
CN108509887A (en) A kind of acquisition ambient lighting information approach, device and electronic equipment
CN114175097A (en) Generating potential texture proxies for object class modeling
CN113658316B (en) Rendering method and device of three-dimensional model, storage medium and computer equipment
CN111275731B (en) Projection type physical interaction desktop system and method for middle school experiments
CN109523622A (en) A kind of non-structured light field rendering method
CN112669436A (en) Deep learning sample generation method based on 3D point cloud
Ma et al. Neural compositing for real-time augmented reality rendering in low-frequency lighting environments
EP3057316B1 (en) Generation of three-dimensional imagery to supplement existing content
Knecht et al. A framework for perceptual studies in photorealistic augmented reality
Chen et al. Rapid creation of photorealistic virtual reality content with consumer depth cameras
Korn et al. Interactive augmentation of live images using a hdr stereo camera
Chen et al. Dynamic omnidirectional texture synthesis for photorealistic virtual content creation
Chen et al. View-dependent virtual reality content from RGB-D images
Steinicke et al. Virtual reflections and virtual shadows in mixed reality environments
Bach et al. Vision-based hand representation and intuitive virtual object manipulation in mixed reality
CN110390686A (en) Naked eye 3D display method and system
Chen et al. Virtual content creation using dynamic omnidirectional texture synthesis

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant