CN107635119A - Projection method and device - Google Patents
Projection method and device
- Publication number: CN107635119A
- Application number: CN201610052018.5A
- Authority
- CN
- China
- Legal status: Granted (assumed; Google has not performed a legal analysis)
Abstract
This application provides a projection method and device, relating to the field of display. The method includes: determining the transparency of the medium between a projected image and at least one eye of a user; and, in response to the transparency being below a threshold, reducing the brightness and/or sharpness of the image. With the method and device, when the transparency of the medium between the projected image and the user's eyes is below a threshold, the brightness and/or sharpness of the image is reduced, so that the image blends better with its surroundings and its realism is improved.
Description
Technical field
The present application relates to the field of display technology, and in particular to a projection method and device.
Background
With the development of display technology, images can be presented to the user's eyes in an increasingly lifelike manner, markedly improving the visual experience. Augmented reality (AR) is a display technology that has developed rapidly in recent years. It computes the position and angle of a camera image in real time and adds a corresponding image; its goal is to overlay a virtual world on the real world on screen and enable interaction between the two.
Fig. 1 is a schematic diagram of an augmented-reality display scene. Light L1 projected from a light source 110 is reflected upon reaching an interface 120, forming reflected light L2, which enters the eyes 130 of the user. The interface 120 can both reflect and transmit light. The user perceives a projected image 140 at a distant projection plane 150. Because the light never actually reaches the projection plane 150, the image 140 is a virtual image. If there is haze in the air around the user, the real objects around the target location appear blurred because the haze occludes them. The image 140, however, being a virtual image, is not occluded by the haze; it therefore appears abrupt and strikes the user as unreal.
Summary of the invention
An object of the present application is to provide a projection method and device that improve the realism of a projected image.
According to a first aspect of at least one embodiment of the present application, a projection method is provided, the method including:
determining the transparency of the medium between a projected image and at least one eye of a user; and
in response to the transparency being below a threshold, reducing the brightness and/or sharpness of the image.
In combination with any possible implementation of the first aspect, in a second possible implementation, the medium includes fog and/or haze.
In combination with any possible implementation of the first aspect, in a third possible implementation, determining the transparency of the medium between a projected image and at least one eye of the user includes:
determining the transparency according to air quality.
In combination with any possible implementation of the first aspect, in a fourth possible implementation, determining the transparency of the medium between a projected image and at least one eye of the user includes:
determining the transparency according to atmospheric visibility.
In combination with any possible implementation of the first aspect, in a fifth possible implementation, the method further includes:
determining a gaze distance from the at least one eye to the image.
In combination with any possible implementation of the first aspect, in a sixth possible implementation, reducing the brightness and/or sharpness of the image includes:
reducing the brightness and/or sharpness of the image according to the transparency and the gaze distance.
In combination with any possible implementation of the first aspect, in a seventh possible implementation, the method further includes:
determining a first correspondence among a reference transparency, a reference gaze distance and a reference brightness.
In combination with any possible implementation of the first aspect, in an eighth possible implementation, the method further includes:
determining a second correspondence among a reference transparency, a reference gaze distance and a reference blur radius.
In combination with any possible implementation of the first aspect, in a ninth possible implementation, reducing the brightness and/or sharpness of the image includes:
determining, according to depth information of the image and the gaze distance, a first simulated distance from a first simulated plane, in which a pixel of the image lies, to the at least one eye; and
reducing the brightness and/or sharpness of the pixel according to the transparency and the first simulated distance.
In combination with any possible implementation of the first aspect, in a tenth possible implementation, the at least one eye includes a left eye and a right eye, and the image includes a left-eye image corresponding to the left eye and a right-eye image corresponding to the right eye;
the method further includes: determining a parallax between a left-eye pixel in the left-eye image and a right-eye pixel in the right-eye image, the left-eye pixel and the right-eye pixel corresponding to a same composite pixel;
reducing the brightness and/or sharpness of the image includes: determining, according to the parallax and the gaze distance, a second simulated distance from a second simulated plane, in which the composite pixel lies, to the at least one eye; and
reducing the brightness and/or sharpness of the left-eye pixel and the right-eye pixel according to the transparency and the second simulated distance.
In combination with any possible implementation of the first aspect, in an eleventh possible implementation, reducing the brightness of the image includes:
reducing the power of a light source used to project the image, thereby reducing the brightness of the image.
In combination with any possible implementation of the first aspect, in a twelfth possible implementation, reducing the sharpness of the image includes:
blurring at least one pixel of the image, thereby reducing the sharpness of the image.
According to a second aspect of at least one embodiment of the present application, a projection device is provided, the device including:
a first determining module, configured to determine the transparency of the medium between a projected image and at least one eye of a user; and
a reduction module, configured to reduce the brightness and/or sharpness of the image in response to the transparency being below a threshold.
In combination with any possible implementation of the second aspect, in a second possible implementation, the first determining module is configured to determine the transparency according to air quality.
In combination with any possible implementation of the second aspect, in a third possible implementation, the first determining module is configured to determine the transparency according to atmospheric visibility.
In combination with any possible implementation of the second aspect, in a fourth possible implementation, the device further includes:
a second determining module, configured to determine a gaze distance from the at least one eye to the image.
In combination with any possible implementation of the second aspect, in a fifth possible implementation, the reduction module is configured to reduce the brightness and/or sharpness of the image according to the transparency and the gaze distance.
In combination with any possible implementation of the second aspect, in a sixth possible implementation, the device further includes:
a first correspondence determining module, configured to determine a first correspondence among a reference transparency, a reference gaze distance and a reference brightness.
In combination with any possible implementation of the second aspect, in a seventh possible implementation, the device further includes:
a second correspondence determining module, configured to determine a second correspondence among a reference transparency, a reference gaze distance and a reference blur radius.
In combination with any possible implementation of the second aspect, in an eighth possible implementation, the reduction module includes:
a first simulated-distance determining unit, configured to determine, according to depth information of the image and the gaze distance, a first simulated distance from a first simulated plane, in which a pixel of the image lies, to the at least one eye; and
a reduction unit, configured to reduce, in response to the transparency being below a threshold, the brightness and/or sharpness of the pixel according to the transparency and the first simulated distance.
In combination with any possible implementation of the second aspect, in a ninth possible implementation, the at least one eye includes a left eye and a right eye, and the image includes a left-eye image corresponding to the left eye and a right-eye image corresponding to the right eye;
the device further includes:
a third determining module, configured to determine a parallax between a left-eye pixel in the left-eye image and a right-eye pixel in the right-eye image, the left-eye pixel and the right-eye pixel corresponding to a same composite pixel;
the reduction module includes:
a second simulated-distance determining unit, configured to determine, according to the parallax and the gaze distance, a second simulated distance from a second simulated plane, in which the composite pixel lies, to the at least one eye; and
a reduction unit, configured to reduce the brightness and/or sharpness of the left-eye pixel and the right-eye pixel according to the transparency and the second simulated distance.
In combination with any possible implementation of the second aspect, in a tenth possible implementation, the reduction module includes:
a brightness reduction unit, configured to reduce the power of a light source used to project the image, thereby reducing the brightness of the image.
In combination with any possible implementation of the second aspect, in an eleventh possible implementation, the reduction module includes:
a sharpness reduction unit, configured to blur at least one pixel of the image, thereby reducing the sharpness of the image.
According to a third aspect of at least one embodiment of the present application, user equipment is provided, the user equipment including:
a memory for storing instructions; and
a processor for executing the instructions stored in the memory, the instructions causing the processor to perform the following operations:
determining the transparency of the medium between a projected image and at least one eye of a user; and
in response to the transparency being below a threshold, reducing the brightness and/or sharpness of the image.
With the method and device, when the transparency of the medium between the projected image and the user's eyes is below a threshold, the brightness and/or sharpness of the image is reduced, so that the image blends better with its surroundings and its realism is improved.
Brief description of the drawings
Fig. 1 is a schematic diagram of an augmented-reality display scene in an embodiment of the application;
Fig. 2 is a flow chart of the projection method in an embodiment of the application;
Fig. 3 is a schematic diagram of an augmented-reality display scene in another embodiment of the application;
Fig. 4 is a simplified diagram of the positional relationships in Fig. 3;
Fig. 5 is a schematic diagram of an augmented-reality display scene in another embodiment of the application;
Fig. 6 is a block diagram of the projection device in an embodiment of the application;
Fig. 7 is a block diagram of the reduction module in an embodiment of the application;
Fig. 8 is a block diagram of the reduction module in another embodiment of the application;
Fig. 9 is a block diagram of the device in another embodiment of the application;
Fig. 10 is a block diagram of the device in another embodiment of the application;
Fig. 11 is a block diagram of the device in another embodiment of the application;
Fig. 12 is a block diagram of the reduction module in another embodiment of the application;
Fig. 13 is a block diagram of the device in another embodiment of the application;
Fig. 14 is a hardware architecture diagram of the user equipment in another embodiment of the application.
Embodiments
The embodiments of the application are described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to illustrate the application, not to limit its scope.
Those skilled in the art will understand that, in the embodiments of the application, the numbering of the steps below does not imply an order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the application.
Fig. 2 is a flow chart of the projection method in an embodiment of the application. The method may be implemented, for example, on a projection device. As shown in Fig. 2, the method includes:
S220: determining the transparency of the medium between a projected image and at least one eye of a user;
S240: in response to the transparency being below a threshold, reducing the brightness and/or sharpness of the image.
With the method of this embodiment, when the transparency of the medium between the projected image and the user's eyes is below a threshold, the brightness and/or sharpness of the image is reduced, so that the image blends better with its surroundings and its realism is improved.
The functions of steps S220 and S240 are described in detail below with reference to specific embodiments.
S220: determining the transparency of the medium between a projected image and at least one eye of a user.
The image is an image that, after being projected by a light source, the user perceives to exist at a certain spatial location. As shown in Fig. 1, the image may be a virtual image: the light emitted by the light source never actually reaches that spatial location; the user merely perceives the image to exist there.
The medium may include any substance between the image and the user's eyes through which light propagates, for example air, fog, haze, smoke, dust, or water. Pure air has no color and does not reduce the transparency of the medium; the application is therefore mainly concerned with cases in which the medium contains substances with an occluding or partially occluding effect, for example fog, haze, smoke, or dust.
The transparency of the medium refers to the medium's ability to transmit visible light. The higher the transparency, the stronger the medium's ability to transmit visible light; the lower the transparency, the weaker that ability.
In one embodiment, the medium mainly refers to the atmospheric environment around the user, so that the transparency of the medium may be the atmospheric transparency and may be determined by existing methods for determining atmospheric transparency.
In addition, atmospheric visibility is an important indicator of atmospheric transparency. In one embodiment, the transparency may also be determined from atmospheric visibility: the higher the visibility, the higher the transparency; the lower the visibility, the lower the transparency. Specifically, the numerical value of the atmospheric visibility may directly represent the transparency, or a direct correspondence between the visibility value and the transparency may be established.
In another embodiment, the transparency may be determined from air quality: the better the air quality, the higher the transparency; the worse the air quality, the lower the transparency. The air quality may be reflected, for example, by an air quality index.
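The two mappings just described can be sketched in code. This is a minimal sketch under assumed scales: the linear form of both mappings, the 10 km full-visibility reference, the 500-point worst AQI, and the function names are all illustrative assumptions; the application only requires that transparency rise with visibility and fall with AQI.

```python
def transparency_from_visibility(visibility_m, full_visibility_m=10000.0):
    # Higher atmospheric visibility -> higher transparency. The linear
    # mapping and the 10 km full-visibility reference are assumptions.
    return max(0.0, min(1.0, visibility_m / full_visibility_m))


def transparency_from_air_quality(aqi, worst_aqi=500.0):
    # Better air quality (lower AQI) -> higher transparency; worse -> lower.
    # The linear scale and 500-point ceiling are assumptions.
    return max(0.0, min(1.0, 1.0 - aqi / worst_aqi))
```

Either normalized value can then be compared against the threshold of step S240.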
In another embodiment, the transparency of the medium may also be determined according to a user-defined standard. For example, assuming the medium is mainly water, a white disk 25 centimetres in diameter may be lowered into the water while being watched until it is no longer visible; the depth at which the disk disappears is then taken as the transparency of the medium.
S240: in response to the transparency being below a threshold, reducing the brightness and/or sharpness of the image.
The threshold may be set according to the actual situation. For example, where the transparency is represented directly by atmospheric visibility, the threshold may be set to 2 metres; that is, in response to the atmospheric visibility being below 2 metres, the brightness and/or sharpness of the image is reduced.
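Steps S220 and S240 together amount to a simple conditional adjustment. The sketch below assumes a normalized transparency in [0, 1]; the threshold value and the adjustment amounts are placeholders, not values taken from the application.

```python
TRANSPARENCY_THRESHOLD = 0.5  # illustrative; the application sets this per situation


def adjust_projection(brightness, blur_radius, transparency):
    # S240: only when the transparency falls below the threshold are the
    # brightness and/or sharpness of the image reduced. A larger blur
    # radius means lower sharpness; the amounts here are placeholders.
    if transparency < TRANSPARENCY_THRESHOLD:
        brightness = max(0.0, brightness - 0.2)  # dim the image
        blur_radius += 2                         # blur the image
    return brightness, blur_radius
```

Above the threshold the image is left untouched, since clear air does not occlude the surrounding real objects either.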
Reducing the brightness of the image means reducing the intensity of the light stimulus the image delivers to the human eye. In one embodiment, the brightness of the image may be reduced by reducing the power of the light source used to project the image. As shown in Fig. 1, if the power of the light source 110 is lowered, the intensity of the reflected light L2 is lowered, the eye perceives the brightness of the image 140 as reduced, and the eye can better see the real objects behind the interface 120. If there is fog in the air, the user then perceives the image 140 as if it were veiled in the fog.
Reducing the sharpness of the image means making the image look blurred. In one embodiment, the sharpness of the image may be reduced by blurring at least one pixel of the image, for example with a mean-blur algorithm or a Gaussian-blur algorithm. After the blurring, the eye likewise perceives the image 140 as occluded by fog.
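As a concrete instance of the blurring mentioned here, the sketch below applies a mean blur (box filter) to a grayscale image stored as a list of rows. A Gaussian kernel would serve equally well; the border handling (a window clipped at the image edges) is an implementation choice, not something the application specifies.

```python
def mean_blur(img, radius):
    # Each output pixel is the average of the (2*radius+1)^2 window around
    # it, with the window clipped at the image borders. A larger radius
    # gives a stronger blur, i.e. lower sharpness.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for yy in range(max(0, y - radius), min(h, y + radius + 1)):
                for xx in range(max(0, x - radius), min(w, x + radius + 1)):
                    total += img[yy][xx]
                    count += 1
            out[y][x] = total / count
    return out
```

A radius of zero leaves every pixel unchanged; step S240 would choose the radius from the transparency.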
In one embodiment, this step may reduce the brightness and/or sharpness of the image by a uniform standard: for example, once the transparency is below the threshold, the brightness of the image is reduced by 10 lumens. Reducing the sharpness of the image in fact amounts to blurring the image: for example, once the transparency is below the threshold, the image is blurred with a predetermined blur radius.
This approach is simple and easy, but it does not consider the degree to which the real object at the specific location of the image is actually occluded; the result may still leave the user feeling that the image does not blend well into the surrounding real scene. In one embodiment, to further improve the realism, the method also includes:
S231: determining a gaze distance from the at least one eye to the image.
The gaze distance is the distance from the at least one eye to the projection plane in which the image lies. That projection plane is determined by the projection parameters of the light source projecting the image; in other words, the projection plane in which the image lies can be determined from the projection parameters, and the gaze distance can then be determined. As shown in Fig. 1, the distance from the eyes 130 to the projection plane 150 is the gaze distance, denoted Dg in Fig. 1. In addition, since the intersection of the line of sight with the interface 120 is close to the eyes 130 relative to Dg, the gaze distance may also be approximated by the distance to the intersection of the line of sight with the projection plane 150, denoted D'g in Fig. 1.
In one embodiment, reducing the brightness and/or sharpness of the image may include:
S240a: reducing the brightness and/or sharpness of the image according to the transparency and the gaze distance.
Taking the reduction of the brightness of the image as an example, the method may determine in advance a first correspondence among a reference transparency, a reference gaze distance and a reference brightness. Specifically, with the environment at its highest transparency and the distance to an object being zero, the brightness of the object may be measured as its intrinsic brightness. The environment transparency and the detection distance are then varied, and the brightness of the object is measured in environments of different transparencies and at different detection distances, yielding the ratio of each measured brightness to the intrinsic brightness. Each transparency may then be taken as a reference transparency and each detection distance as a reference gaze distance, with the product of the corresponding ratio and the maximum brightness of the image taken as the reference brightness corresponding to that reference transparency and reference gaze distance.
The first correspondence among the reference transparency, the reference gaze distance and the reference brightness may be stored in a table. In step S240a, the table may be queried with the transparency and the gaze distance to determine the corresponding reference brightness, and the brightness of the image is then reduced to that reference brightness.
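The table query of step S240a can be sketched as follows. The calibration entries are invented for illustration only; an implementation would fill the table from the measurements described above, and nearest-entry lookup (rather than, say, interpolation between entries) is an assumption.

```python
# Hypothetical first correspondence: (reference transparency,
# reference gaze distance in metres) -> measured-to-intrinsic brightness ratio.
FIRST_CORRESPONDENCE = {
    (0.2, 5.0): 0.40,
    (0.2, 10.0): 0.25,
    (0.5, 5.0): 0.70,
    (0.5, 10.0): 0.55,
}


def reference_brightness(transparency, gaze_distance, max_brightness):
    # Query the table with the nearest (transparency, gaze distance) entry
    # and scale the image's maximum brightness by the stored ratio. The
    # image's brightness is then reduced to this reference brightness.
    nearest = min(FIRST_CORRESPONDENCE,
                  key=lambda k: (k[0] - transparency) ** 2
                              + (k[1] - gaze_distance) ** 2)
    return FIRST_CORRESPONDENCE[nearest] * max_brightness
```

The second correspondence (blur radius) can be stored and queried in exactly the same shape, with the ratio column replaced by a reference blur radius.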
When reducing the sharpness of the image, the method may determine in advance a second correspondence among a reference transparency, a reference gaze distance and a reference blur radius. Specifically, the highest sharpness of an original image may be taken as its actual sharpness, with a corresponding blur radius of zero. The environment transparency and the detection distance are then varied; in environments of different transparencies and at different detection distances, sample images of the original image are captured and the sharpness of each sample image is measured. At the same time, the original image is blurred so that the sharpness of each processed image equals the sharpness of the corresponding sample image, and the blur radius of each such blurring is recorded. Each transparency may then be taken as a reference transparency and each detection distance as a reference gaze distance, with the corresponding blur radius taken as the reference blur radius for that reference transparency and reference gaze distance.
The second correspondence among the reference transparency, the reference gaze distance and the reference blur radius may be stored in a table. In step S240a, the table may be queried with the transparency and the gaze distance to determine the corresponding reference blur radius, and the sharpness of the image is then reduced according to that reference blur radius.
An ordinary two-dimensional image can be processed according to the above embodiments to improve its realism. In some cases, however, although the image is two-dimensional, it carries a sense of depth of its own. For example, if the image is a large building, a user looking at it naturally takes the region of the building facing him to be nearer and the region facing away to be farther. In such a case, applying uniform processing to the whole image often still strikes the user as unreal. Therefore, in another embodiment, reducing the brightness and/or sharpness of the image includes:
S241b: determining, according to depth information of the image and the gaze distance, a first simulated distance from a first simulated plane, in which a pixel of the image lies, to the at least one eye;
S242b: reducing the brightness and/or sharpness of the pixel according to the transparency and the first simulated distance.
The depth information of the image may be obtained from the original data of the image. For example, if the image is generated by projecting a 3D image, the depth information of every pixel of the image may be obtained from the original data of the 3D image. The depth information reflects the positional relationship of each pixel within the 3D image.
As shown in Fig. 3, suppose a 3D image of a cube is projected by a light source (not shown) onto a projection plane 350, and the user's eye 330 sees the image 340 of the cube through an interface 320. The solid lines are the parts of the image 340 the user actually sees; the dashed lines represent the parts the user can imagine. The solid arrows below the interface 320 represent the light that actually enters the user's eye 330; the dashed backward extensions of these rays represent the light the user perceives to exist. It will be appreciated that the image 340 here is also a virtual image, and that the projection is in substance a 2D image on the projection plane 350.
Points A and B on the image 340 correspond to an A pixel and a B pixel respectively. Although both pixels lie on the projection plane 350, the user, following the shape of the cube, perceives the B pixel as nearer and the A pixel as farther; it is therefore more realistic for the A pixel to be occluded by the smoke somewhat more heavily than the B pixel.
In step S241b, the first simulated distance from the pixel to the at least one eye is the distance from the first simulated plane, in which the user perceives the pixel to lie, to the user's eye; this first simulated plane may be in front of or behind the projection plane. Taking the B pixel as an example, suppose the depth information of the image shows that the B pixel and the A pixel differ by n pixels along the normal direction of the projection plane. Suppose also that the A pixel lies exactly on the projection plane 350; the first simulated plane of the A pixel is then the projection plane itself, and the first simulated distance from that plane to the user's eye is the gaze distance Dg.
Simplifying Fig. 3 yields the positional diagram shown in Fig. 4, in which L denotes the normal direction of the projection plane 350, Da denotes the first simulated distance from the first simulated plane in which the user perceives pixel A to lie to the user's eye, and Db denotes the first simulated distance from the first simulated plane in which the user perceives pixel B to lie to the user's eye. The pixel pitch Dp of the image 340 can be determined from the projection parameters, which gives:
Db = Da - n × Dp = Dg - n × Dp    (1)
Similarly, the above processing yields the first simulated distance from the first simulated plane of any pixel of the image 340 to the user's eye. The above processing assumed that the user perceives pixel A to lie exactly on the projection plane 350; a pixel perceived to lie exactly on the projection plane 350 may be denoted the reference pixel. Those skilled in the art will understand that any pixel may be chosen as the reference pixel; for simplicity, the point at the geometric center of the image may be chosen as the reference pixel.
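Equation (1) is straightforward to compute per pixel. The sketch below assumes the depth offset n is signed, positive when the user perceives the pixel as nearer than the reference pixel (as for pixel B relative to pixel A); that sign convention is an illustrative assumption.

```python
def first_simulated_distance(gaze_distance, depth_offset_pixels, pixel_pitch):
    # Equation (1): Db = Dg - n * Dp. gaze_distance is Dg (eye to
    # projection plane), depth_offset_pixels is n (offset along the plane
    # normal from the reference pixel), pixel_pitch is Dp.
    return gaze_distance - depth_offset_pixels * pixel_pitch
```

The reference pixel (n = 0) yields the gaze distance itself; a negative n places the pixel's simulated plane behind the projection plane.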
The realization principle of step S242b is similar to that of step S240a, with the gaze distance simply replaced by the first simulated distance. For brevity, only the reduction of the brightness of the image is described. The method may determine in advance a third correspondence among a reference transparency, a reference first simulated distance and a reference brightness. Specifically, with the environment at its highest transparency and the distance to the image being zero, the brightness of the image may be measured as its intrinsic brightness. The environment transparency and the detection distance are then varied, and the brightness of the object is measured in environments of different transparencies and at different detection distances, yielding the ratio of each measured brightness to the intrinsic brightness. Each transparency may then be taken as a reference transparency and each detection distance as a reference first simulated distance, with the product of the corresponding ratio and the maximum brightness of the image taken as the reference brightness corresponding to that reference transparency and reference first simulated distance.
The third correspondence among the reference transparency, the reference first simulated distance and the reference brightness may be stored in a table. In step S242b, the table may be queried with the transparency and the first simulated distance to determine the corresponding reference brightness, and the brightness of the pixel is then reduced to that reference brightness.
In another embodiment, the at least one eye includes a left eye and a right eye, and the image includes a left-eye image corresponding to the left eye and a right-eye image corresponding to the right eye. There may be parallax between the left-eye image and the right-eye image, so that the image the user finally perceives has a stereoscopic effect. In this embodiment, to improve the realism, the method may also include:
S232: determining a parallax between a left-eye pixel in the left-eye image and a right-eye pixel in the right-eye image, the left-eye pixel and the right-eye pixel corresponding to a same composite pixel.
Reducing the brightness and/or sharpness of the image includes:
S241c: determining, according to the parallax and the gaze distance, a second simulated distance from a second simulated plane, in which the composite pixel lies, to the at least one eye;
S242c: reducing the brightness and/or sharpness of the left-eye pixel and the right-eye pixel according to the transparency and the second simulated distance.
The composite pixel is the pixel the user finally perceives; it is in fact the combined visual effect of the left-eye pixel and the right-eye pixel on the user.
As shown in Fig. 5, suppose the user's left eye 531 sees, through reflection at an interface 520, a left-eye pixel Cl in the left-eye image, and the right eye 532 sees, through reflection at the interface 520, a right-eye pixel Cr in the right-eye image. Suppose Cl and Cr both lie on a projection plane 550, and the parallax between Cl and Cr is m pixels; the composite pixel C finally perceived by the user may then lie behind the projection plane 550. The pixel pitch of the left-eye and right-eye images can be obtained from the projection parameters as Dp, so the parallax distance between Cl and Cr is Ds = m × Dp. From the similar-triangle relations the following can be obtained:

Dc = De × Dg / (De − Ds)

where De denotes the distance between the left eye and the right eye, Dc denotes the second simulated distance from the second simulation plane where the composite pixel C lies to the at least one eye, and Dg denotes the distance from the at least one eye to the projection plane 550. Dc can be calculated according to the above formula.
The implementation principle of step S242c is similar to that of step S240a, except that the gaze distance is replaced with the second simulated distance. For simplicity, only the reduction of the brightness of the image is described as an example. The method may predetermine a third correspondence between a reference transparency, a reference second simulated distance and a reference brightness. Specifically, for example, under the highest environmental transparency and at a distance of zero from the image, the brightness of the image may be acquired as the intrinsic brightness; the environmental transparency and the detection distance are then varied, the brightness of the object is detected at different detection distances in environments of different transparencies, and the ratios of the detected brightness values to the intrinsic brightness are obtained. Each transparency can then be taken as a reference transparency and each detection distance as a reference second simulated distance, and the product of the corresponding ratio and the maximum brightness of the image can be taken as the reference brightness corresponding to that reference transparency and reference second simulated distance.
The third correspondence between the reference transparency, the reference second simulated distance and the reference brightness can be stored in a table. In step S242c, the table can be queried according to the transparency and the second simulated distance to determine the corresponding reference brightness, and the brightness of the image can then be reduced to that reference brightness.
In addition, an embodiment of the present application further provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the following operations: performing the operations of steps S220 and S240 of the method in the embodiment shown in Fig. 2 above.
In summary, the method described herein can appropriately reduce the brightness and/or sharpness of the projected image according to the transparency of the medium and the viewing distance perceived by the user, so that the projected image blends better with the surrounding real objects, improving the sense of reality of an augmented-reality display.
Fig. 6 is a schematic diagram of the module structure of the projection device according to an embodiment of the present invention. The projection device may be arranged in a display device as a functional module, or may act as a separate device that completes the corresponding functions by communicating with a display device. The device 600 may include:
a first determining module 610, configured to determine the transparency of a medium between a projected image and at least one eye of a user;
a reduction module 620, configured to reduce the brightness and/or sharpness of the image in response to the transparency being less than a threshold.
With the device described in this embodiment of the present application, when the transparency of the medium between the projected image and the user's eyes is less than a threshold, the brightness and/or sharpness of the image is reduced, so that the image blends better with the surrounding environment, improving the sense of reality of the image.
The functions of the first determining module 610 and the reduction module 620 are described in detail below in conjunction with specific embodiments.
The first determining module 610 is configured to determine the transparency of the medium between a projected image and at least one eye of the user.
The image refers to an image that, after being projected by a light source, the user perceives to exist at a certain spatial position. As shown in Fig. 1, the image may be a virtual image: the light emitted by the light source does not really reach that spatial position; the user merely perceives the image to exist there.
The medium may include any light-propagating material between the image and the user's eyes, for example air, fog, haze, smoke, dust or water. Since clean air has no colour, it does not reduce the transparency of the medium. The present application is therefore mainly concerned with the situation in which the medium contains some material with an occluding or semi-occluding effect, for example when fog, haze, smoke or dust is present in the medium.
The transparency of the medium refers to the ability of the medium to transmit visible light. The higher the transparency, the stronger the medium's ability to transmit visible light; the lower the transparency, the weaker that ability.
In one embodiment, the medium mainly refers to the atmospheric environment around the user, so the transparency of the medium may be the atmospheric transparency, and the transparency of the medium can accordingly be determined using existing methods for determining atmospheric transparency.
In addition, atmospheric visibility is an important indicator of atmospheric transparency. In one implementation, the first determining module 610 is configured to determine the transparency according to the atmospheric visibility: the higher the atmospheric visibility, the higher the transparency; the lower the atmospheric visibility, the lower the transparency. Specifically, the transparency may be represented directly by the numerical value of the atmospheric visibility, or a direct correspondence between the numerical value of the atmospheric visibility and the transparency may be established.
In another implementation, the first determining module 610 is configured to determine the transparency according to the air quality: the better the air quality, the higher the transparency; the worse the air quality, the lower the transparency. The air quality may be reflected, for example, by an air quality index.
In yet another implementation, the transparency of the medium may be determined according to a custom standard. For example, assuming the medium is mainly water, a white disk 25 centimetres in diameter can be lowered into the water while being watched, until it is no longer visible. The depth to which the disk has sunk is then the transparency of the medium.
The reduction module 620 is configured to reduce the brightness and/or sharpness of the image in response to the transparency being less than a threshold.
Wherein, the threshold value can be set according to actual conditions, such as direct in the transparency
In the case of using atmospheric visibility, the threshold value can be arranged to 2 meters, i.e., in response to big
Gas visibility is less than 2 meters, reduces brightness and/or the definition of the image.
Reducing the brightness of the image means reducing the intensity of the light stimulus that the image delivers to the human eye. In one implementation, referring to Fig. 7, the reduction module 620 includes: a brightness reduction unit 621, configured to reduce the brightness of the image by reducing the power of the light source used to project the image.
Reducing the sharpness of the image means making the image look blurrier. In one implementation, referring to Fig. 8, the reduction module 620 includes: a sharpness reduction unit 622, configured to reduce the sharpness of the image by blurring at least one pixel of the image.
In one embodiment, the reduction module 620 may reduce the brightness and/or sharpness of the image according to a unified standard. For example, once the transparency is less than the threshold, the brightness of the image is reduced by 10 lumens. Reducing the sharpness of the image in fact amounts to blurring the image; for example, once the transparency is less than the threshold, the image is blurred according to a predetermined blur radius.
The above processing is simple and easy, but it does not consider the degree to which the real object at a specific position in the image is actually occluded, and its result may still leave the user feeling that the image does not blend well into the surrounding real scene. In one embodiment, in order to further improve the sense of reality, referring to Fig. 9, the device 600 further includes:
a second determining module 630, configured to determine the gaze distance from the at least one eye to the image.
The gaze distance is the distance from the at least one eye to the projection plane where the image lies. The projection plane where the image lies is determined by the projection parameters of the light source projecting the image; in other words, the projection plane where the image lies can be determined from the projection parameters, and the gaze distance can thus be determined.
In one implementation, the reduction module 620 is configured to reduce the brightness and/or sharpness of the image according to the transparency and the gaze distance.
First, taking the reduction of the brightness of the image as an example, referring to Fig. 10, the device 600 may further include: a first correspondence determining module 640, configured to determine a first correspondence between a reference transparency, a reference gaze distance and a reference brightness. The first correspondence between the reference transparency, the reference gaze distance and the reference brightness can be stored in a table; the reduction module 620 can query the table according to the transparency and the gaze distance to determine the corresponding reference brightness, and then reduce the brightness of the image to that reference brightness.
In addition, when reducing the sharpness of the image, referring to Fig. 11, the device 600 further includes: a second correspondence determining module 650, configured to determine a second correspondence between a reference transparency, a reference gaze distance and a reference blur radius. The second correspondence between the reference transparency, the reference gaze distance and the reference blur radius can be stored in a table; the reduction module 620 can query the table according to the transparency and the gaze distance to determine the corresponding reference blur radius, and then reduce the sharpness of the image according to that reference blur radius.
A general two-dimensional image can be processed according to the above implementations to improve the sense of reality of the image. In some cases, however, although the image is two-dimensional, it carries a certain sense of depth in itself. For example, if the image is a large building, the user viewing the building will naturally think that the region of the building facing him is closer and that the region facing away from him is farther away. In such cases, applying uniform processing to the whole image often still makes the user feel that it is unreal. Therefore, in another embodiment, referring to Fig. 12, the reduction module 620 includes:
a first simulated distance determining unit 623, configured to determine, according to depth information of the image and the gaze distance, a first simulated distance from a first simulation plane where a pixel of the image lies to the at least one eye;
a reduction unit 624, configured to reduce, in response to the transparency being less than a threshold, the brightness and/or sharpness of the pixel according to the transparency and the first simulated distance.
The depth information of the image can be obtained from the original data used to generate the image. For example, if the image is generated by projecting a 3D image, the depth information of each pixel of the image can be obtained from the original data of that 3D image. The depth information reflects the positional relationship of each pixel within the 3D image.
As shown in Fig. 3, suppose a 3D image of a cube is projected by a light source (not shown) onto a projection plane 350, and the user's eye 330 sees the image 340 of the cube through an interface 320. The solid lines are the parts of the image 340 the user actually sees, and the dashed lines represent the parts the user can imagine. The solid arrowed lines below the interface 320 represent the light that actually enters the user's eye 330, and the backward extensions of these rays (dashed) represent the light the user perceives to exist. It will be appreciated that the image 340 here is a virtual image, and that the projection is in essence a 2D image on the projection plane 350.
The two points A and B on the image 340 correspond to pixel A and pixel B respectively. Although pixel A and pixel B both lie on the projection plane 350, according to the shape of the cube the user will feel that pixel B is closer and pixel A is farther away; the image is therefore more realistic if pixel A is occluded by the smoke somewhat more heavily than pixel B.
The first simulated distance from a pixel to the at least one eye is the distance from the first simulation plane where the user perceives that pixel to lie to the user's own eyes; the first simulation plane may be in front of or behind the projection plane. Taking pixel B as an example, suppose that, according to the depth information of the image, pixel B and pixel A differ by n pixels along the normal direction of the projection plane. Suppose also that pixel A lies exactly on the projection plane 350; the first simulation plane where pixel A lies is then the projection plane itself, and the first simulated distance from the first simulation plane of pixel A to the user's eyes is the gaze distance Dg.
Simplifying Fig. 3 gives the positional relationship shown in Fig. 4, in which L denotes the normal direction of the projection plane 350, Da denotes the first simulated distance from the first simulation plane where the user perceives pixel A to lie to the user's eyes, and Db denotes the first simulated distance from the first simulation plane where the user perceives pixel B to lie to the user's eyes. The pixel pitch Dp of the image 340 can be determined from the projection parameters, and the above formula (1) can then be obtained.
Similarly, the above processing can be used to obtain the first simulated distance from the first simulation plane where any pixel of the image 340 lies to the user's eyes. In addition, the above processing assumed that the user perceives pixel A to lie exactly on the projection plane 350; a pixel that the user perceives to lie exactly on the projection plane 350 can be denoted a benchmark pixel. Those skilled in the art will understand that any pixel can be selected as the benchmark pixel; for simplicity, the point at the geometric centre of the image can be selected as the benchmark pixel.
The implementation principle of reducing the brightness and/or sharpness of the pixel according to the transparency and the first simulated distance is similar to that of reducing the brightness and/or sharpness of the image according to the transparency and the gaze distance, except that the gaze distance is replaced with the first simulated distance. For simplicity, only the reduction of the brightness of the image is described as an example. The method may predetermine a third correspondence between a reference transparency, a reference first simulated distance and a reference brightness. Specifically, for example, under the highest environmental transparency and at a distance of zero from the image, the brightness of the image may be acquired as the intrinsic brightness; the environmental transparency and the detection distance are then varied, the brightness of the object is detected at different detection distances in environments of different transparencies, and the ratios of the detected brightness values to the intrinsic brightness are obtained. Each transparency can then be taken as a reference transparency and each detection distance as a reference first simulated distance, and the product of the corresponding ratio and the maximum brightness of the image can be taken as the reference brightness corresponding to that reference transparency and reference first simulated distance.
The third correspondence between the reference transparency, the reference first simulated distance and the reference brightness can be stored in a table; the reduction unit 624 can query the table according to the transparency and the first simulated distance to determine the corresponding reference brightness, and then reduce the brightness of the image to that reference brightness.
In another embodiment, the at least one eye includes a left eye and a right eye, and the image includes a left-eye image corresponding to the left eye and a right-eye image corresponding to the right eye. A parallax may exist between the left-eye image and the right-eye image, so that the image the user finally perceives has a stereoscopic effect. In this embodiment, referring to Fig. 13, in order to improve the sense of reality, the device 600 further includes:
a third determining module 650, configured to determine a parallax between a left-eye pixel in the left-eye image and a right-eye pixel in the right-eye image, the left-eye pixel and the right-eye pixel corresponding to the same composite pixel;
the reduction module 620 includes:
a second simulated distance determining unit 625, configured to determine, according to the parallax and the gaze distance, a second simulated distance from a second simulation plane where the composite pixel lies to the at least one eye;
a reduction unit 626, configured to reduce the brightness and/or sharpness of the left-eye pixel and the right-eye pixel according to the transparency and the second simulated distance.
Here the composite pixel is the pixel the user finally perceives; it is in fact the combined visual effect produced for the user by the left-eye pixel and the right-eye pixel.
As shown in Fig. 5, suppose the user's left eye 531 sees, through reflection at an interface 520, a left-eye pixel Cl in the left-eye image, and the right eye 532 sees, through reflection at the interface 520, a right-eye pixel Cr in the right-eye image. Suppose Cl and Cr both lie on a projection plane 550, and the parallax between Cl and Cr is m pixels; the composite pixel C finally perceived by the user may then lie behind the projection plane 550. The pixel pitch of the left-eye and right-eye images can be obtained from the projection parameters as Dp, so the parallax distance between Cl and Cr is Ds = m × Dp. The above formula (2) can be obtained from the similar-triangle relations, and Dc can be calculated according to formula (2).
The implementation principle of reducing the brightness and/or sharpness of the left-eye pixel and the right-eye pixel according to the transparency and the second simulated distance is similar to that of reducing the brightness and/or sharpness of the image according to the transparency and the gaze distance, except that the gaze distance is replaced with the second simulated distance and both pixels are adjusted. For simplicity, only the reduction of the brightness of the image is described as an example. The method may predetermine a third correspondence between a reference transparency, a reference second simulated distance and a reference brightness. Specifically, for example, under the highest environmental transparency and at a distance of zero from the image, the brightness of the image may be acquired as the intrinsic brightness; the environmental transparency and the detection distance are then varied, the brightness of the object is detected at different detection distances in environments of different transparencies, and the ratios of the detected brightness values to the intrinsic brightness are obtained. Each transparency can then be taken as a reference transparency and each detection distance as a reference second simulated distance, and the product of the corresponding ratio and the maximum brightness of the image can be taken as the reference brightness corresponding to that reference transparency and reference second simulated distance.
The third correspondence between the reference transparency, the reference second simulated distance and the reference brightness can be stored in a table; the reduction unit 626 can query the table according to the transparency and the second simulated distance to determine the corresponding reference brightness, and then reduce the brightness of the image to that reference brightness.
In summary, the device described herein can appropriately reduce the brightness and/or sharpness of the projected image according to the transparency of the medium and the viewing distance perceived by the user, so that the projected image blends better with the surrounding real objects, improving the sense of reality of an augmented-reality display.
The hardware structure of a user equipment according to one embodiment of the present application is shown in Fig. 14. The specific embodiments of the present application do not limit the specific implementation of the user equipment. Referring to Fig. 14, the user equipment 1400 may include:
a processor 1410, a communications interface 1420, a memory 1430 and a communication bus 1440, wherein the processor 1410, the communications interface 1420 and the memory 1430 communicate with one another through the communication bus 1440.
The communications interface 1420 is used for communicating with other network elements.
The processor 1410 is configured to execute a program 1432, and may specifically perform the relevant steps of the method embodiment shown in Fig. 1 above.
Specifically, the program 1432 may include program code, and the program code includes computer operation instructions.
The processor 1410 may be a central processing unit (CPU), or an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 1430 is used for storing the program 1432. The memory 1430 may include high-speed RAM memory, and may also include non-volatile memory, for example at least one magnetic disk memory. The program 1432 may specifically perform the following steps:
determining the transparency of a medium between a projected image and at least one eye of a user;
in response to the transparency being less than a threshold, reducing the brightness and/or sharpness of the image.
For the specific implementation of each step in the program 1432, reference may be made to the corresponding steps or modules in the above embodiments, which will not be described here. It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working process of the devices and modules described above, reference may be made to the corresponding process descriptions in the foregoing method embodiments, which will not be repeated here.
Those of ordinary skill in the art will appreciate that the units and method steps of each example described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each specific application, but such implementation should not be considered to be beyond the scope of the present application.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they can be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a controller, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The foregoing storage medium includes various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above embodiments are merely intended to illustrate the present application and are not a limitation of it. Persons of ordinary skill in the relevant technical field can also make various changes and modifications without departing from the spirit and scope of the present application; all equivalent technical solutions therefore fall within the scope of the present application, and the patent protection scope of the present application shall be defined by the claims.
Claims (10)
1. A projection method, characterized in that the method comprises:
determining the transparency of a medium between a projected image and at least one eye of a user;
in response to the transparency being less than a threshold, reducing the brightness and/or sharpness of the image.
2. The method according to claim 1, characterized in that the method further comprises:
determining a gaze distance from the at least one eye to the image.
3. The method according to claim 2, characterized in that the reducing the brightness and/or sharpness of the image comprises:
reducing the brightness and/or sharpness of the image according to the transparency and the gaze distance.
4. The method according to claim 2, characterized in that the reducing the brightness and/or sharpness of the image comprises:
determining, according to depth information of the image and the gaze distance, a first simulated distance from a first simulation plane where a pixel of the image lies to the at least one eye;
reducing the brightness and/or sharpness of the pixel according to the transparency and the first simulated distance.
5. The method according to claim 2, characterized in that the at least one eye comprises a left eye and a right eye, and the image comprises a left-eye image corresponding to the left eye and a right-eye image corresponding to the right eye;
the method further comprises: determining a parallax between a left-eye pixel in the left-eye image and a right-eye pixel in the right-eye image, the left-eye pixel and the right-eye pixel corresponding to the same composite pixel;
the reducing the brightness and/or sharpness of the image comprises: determining, according to the parallax and the gaze distance, a second simulated distance from a second simulation plane where the composite pixel lies to the at least one eye;
reducing the brightness and/or sharpness of the left-eye pixel and the right-eye pixel according to the transparency and the second simulated distance.
6. A projection device, characterized in that the device comprises:
a first determining module, configured to determine the transparency of a medium between a projected image and at least one eye of a user;
a reduction module, configured to reduce the brightness and/or sharpness of the image in response to the transparency being less than a threshold.
7. The device according to claim 6, characterized in that the device further comprises:
a second determining module, configured to determine a gaze distance from the at least one eye to the image.
8. The device according to claim 7, characterized in that the reduction module is configured to reduce the brightness and/or sharpness of the image according to the transparency and the gaze distance.
9. A user equipment, characterized in that the user equipment comprises the projection device according to any one of claims 6 to 8.
10. A user equipment, characterized in that the user equipment comprises:
a memory for storing instructions;
a processor for executing the instructions stored in the memory, the instructions causing the processor to perform the following operations:
determining the transparency of a medium between a projected image and at least one eye of a user;
in response to the transparency being less than a threshold, reducing the brightness and/or sharpness of the image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610052018.5A CN107635119B (en) | 2016-01-26 | 2016-01-26 | Projective techniques and equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107635119A true CN107635119A (en) | 2018-01-26 |
CN107635119B CN107635119B (en) | 2019-05-21 |
Family
ID=61111972
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610052018.5A Active CN107635119B (en) | 2016-01-26 | 2016-01-26 | Projective techniques and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107635119B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113052155A (en) * | 2021-02-03 | 2021-06-29 | 深圳赋能软件有限公司 | Light source brightness regulator, structured light projector, identity recognition device and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110128509A1 (en) * | 2009-11-27 | 2011-06-02 | Casio Computer Co., Ltd. | Light source device, projection apparatus, and projection method |
CN102175613A (en) * | 2011-01-26 | 2011-09-07 | 南京大学 | Image-brightness-characteristic-based pan/tilt/zoom (PTZ) video visibility detection method |
US20130083299A1 (en) * | 2011-09-30 | 2013-04-04 | Coretronic Corporation | Projector and light source controlling method thereof |
CN103680429A (en) * | 2012-08-28 | 2014-03-26 | 纬创资通股份有限公司 | Image information display device and image information display and adjustment method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113052155A (en) * | 2021-02-03 | 2021-06-29 | 深圳赋能软件有限公司 | Light source brightness regulator, structured light projector, identity recognition device and electronic equipment |
CN113052155B (en) * | 2021-02-03 | 2024-04-02 | 深圳赋能光达科技有限公司 | Light source brightness regulator, structured light projector, identity recognition device and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN107635119B (en) | 2019-05-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Kruijff et al. | Perceptual issues in augmented reality revisited | |
US11379948B2 (en) | Mixed reality system with virtual content warping and method of generating virtual content using same | |
CN108780578B (en) | Augmented reality system and method of operating an augmented reality system | |
EP3959688B1 (en) | Generative latent textured proxies for object category modeling | |
US20090251460A1 (en) | Systems and methods for incorporating reflection of a user and surrounding environment into a graphical user interface | |
EP0583060A2 (en) | Method and system for creating an illusion of three-dimensionality | |
JP7201869B1 (en) | Generate new frames with rendered and unrendered content from the previous eye | |
US11113891B2 (en) | Systems, methods, and media for displaying real-time visualization of physical environment in artificial reality | |
CN110070621A (en) | Electronic device, the method and computer readable media for showing augmented reality scene | |
CN109640070A (en) | A kind of stereo display method, device, equipment and storage medium | |
US20180374258A1 (en) | Image generating method, device and computer executable non-volatile storage medium | |
US20220172319A1 (en) | Camera-based Transparent Display | |
EP3767591A1 (en) | Illumination effects from luminous inserted content | |
CN107635119A (en) | Projective techniques and equipment | |
JP6898264B2 (en) | Synthesizers, methods and programs | |
US20230396750A1 (en) | Dynamic resolution of depth conflicts in telepresence | |
US20210233313A1 (en) | Systems, Methods, and Media for Visualizing Occluded Physical Objects Reconstructed in Artificial Reality | |
KR20230097163A (en) | Three-dimensional (3D) facial feature tracking for autostereoscopic telepresence systems | |
CN111837161A (en) | Shadowing images in three-dimensional content systems | |
CN110853147A (en) | Three-dimensional face transformation method | |
Thatte | Cinematic virtual reality with head-motion parallax | |
JP3546922B2 (en) | Eyeglass lens image generation method and apparatus | |
CN116977540A (en) | Volume cloud rendering method and device, electronic equipment and storage medium | |
Hassaine | Efficient rendering for three-dimensional displays | |
Alhashim | Depth of field simulation in computer graphics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||