CN107025683A - Information processing method and electronic device - Google Patents
- Publication number: CN107025683A
- Application number: CN201710203896.7A
- Authority
- CN
- China
- Prior art keywords
- shade
- determined
- light
- environmental information
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/60—Shadow generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/80—Shading
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses an information processing method and an electronic device, to solve the technical problem that when an electronic device augments a real scene using AR technology, its real-time reproduction capability is weak, so that the present-day real scene cannot be augmented accurately. The method includes: obtaining a real-scene image of a real scene that contains a first object; obtaining environmental information about the environment in which the first object is located and a virtual object model corresponding to the first object; and processing the virtual object model according to the environmental information to generate an AR real-scene image corresponding to the real scene.
Description
Technical field
The present invention relates to the field of computer technology, and in particular to an information processing method and an electronic device.
Background
With the development of electronic technology, electronic devices now employ Augmented Reality (AR) technology to enrich the user's visual experience. AR supplements and overlays a designed virtual model with information from the real world, so that virtual information and real information appear together in the same picture; that is, virtual objects and the real scene are recombined into an image with lifelike visual and tactile qualities, enabling natural interaction between the user and the environment and delivering a new kind of sensory experience.
At present, when AR technology is used to display virtual and real information in the same picture - for example, when a three-dimensional virtual model of a house is overlaid on the picture of that house currently being captured - the pre-designed three-dimensional virtual model of the house is read directly, synthesized with the current shot via AR technology, and then displayed in the captured picture.
It can be seen that the prior art realizes augmented reality for a real scene using a pre-stored three-dimensional virtual model. That pre-stored model is fixed, so its applicability across different display scenes is very low. For example, at the present moment the photographed house itself casts a shadow because of factors such as the illumination direction, but the three-dimensional virtual model read by the electronic device may have no shadow at all, or a shadow whose position does not match the current light direction. The resulting AR picture therefore does not match the current environment. In other words, the electronic device's ability to reproduce the real scene in real time while augmenting it with AR technology is weak: it cannot accurately augment the present-day real scene, and cannot give the user a good visual effect.
Summary of the invention
Embodiments of the present invention provide an information processing method and an electronic device, to solve the technical problem that an electronic device's real-time reproduction capability is weak when it augments a real scene using AR technology, so that the present-day real scene cannot be augmented accurately.
In a first aspect, an information processing method is provided, including:
obtaining a real-scene image of a real scene that contains a first object;
obtaining environmental information about the environment in which the first object is located and a virtual object model corresponding to the first object;
processing the virtual object model according to the environmental information, to generate an AR real-scene image corresponding to the real scene.
In a possible implementation, processing the virtual object model according to the environmental information to generate the AR real-scene image corresponding to the real scene includes:
determining a virtual shadow model of the first object's shadow according to the environmental information, and/or determining a light-dark ratio model of the first object according to the environmental information, where the light-dark ratio model characterizes the brightness differences among the parts of the first object when displayed;
generating AR object information for the first object based on the virtual object model together with the virtual shadow model and/or the light-dark ratio model;
replacing the image information of the first object in the real-scene image with the AR object information, to obtain the AR real-scene image.
In a possible implementation, determining the virtual shadow model of the first object's shadow according to the environmental information includes:
determining, according to the environmental information, the shadow region occupied by the first object's shadow;
determining the virtual shadow model according to the shape and size of the shadow region.
In a possible implementation, determining the shadow region occupied by the first object's shadow according to the environmental information includes:
determining, according to the environmental information, the position of the light source in the environment in which the first object is located;
determining the direction and angle of the first object's shadow according to the relative positions of the light source and the first object;
determining the shadow region according to the direction and angle of the first object's shadow.
In a possible implementation, determining the shadow region according to the direction and angle of the first object's shadow includes:
determining, according to the environmental information, a second object located between the first object and the light source;
determining, according to the relative positions of the first object, the second object, and the light source, the light-blocking area that the second object causes for the first object;
determining the shadow region according to the direction and angle of the first object's shadow together with the light-blocking area.
In a possible implementation, determining the light-dark ratio model of the parts of the first object according to the environmental information includes:
determining, according to the environmental information, the illumination direction of the light source in the first object's environment relative to the first object;
determining, according to the illumination direction, the reflected-intensity ratio of each part of the first object to the light;
determining the light-dark ratio model according to the reflected-intensity ratio.
In a possible implementation, determining the light-dark ratio model according to the reflected-intensity ratio includes:
determining the display-brightness ratio of each part of the first object according to the illumination intensity of the light source;
determining the light-dark ratio model according to the reflected-intensity ratio and the display-brightness ratio.
In a second aspect, an electronic device is provided, including:
a sensor, for obtaining environmental information;
a processor, connected with the sensor, for obtaining a real-scene image that contains a first object; obtaining the environmental information of the environment in which the first object is located and a virtual object model corresponding to the first object; and processing the virtual object model according to the environmental information, to generate an AR real-scene image corresponding to the real scene.
In a possible implementation, the processor processing the virtual object model according to the environmental information to generate the AR real-scene image corresponding to the real scene includes:
determining a virtual shadow model of the first object's shadow according to the environmental information, and/or determining a light-dark ratio model of the first object according to the environmental information, where the light-dark ratio model characterizes the brightness differences among the parts of the first object when displayed;
generating AR object information for the first object based on the virtual object model together with the virtual shadow model and/or the light-dark ratio model;
replacing the image information of the first object in the real-scene image with the AR object information, to obtain the AR real-scene image.
In a possible implementation, the processor determining the virtual shadow model of the first object's shadow according to the environmental information includes:
determining, according to the environmental information, the shadow region occupied by the first object's shadow;
determining the virtual shadow model according to the shape and size of the shadow region.
In a possible implementation, the processor determining the shadow region occupied by the first object's shadow according to the environmental information includes:
determining, according to the environmental information, the position of the light source in the environment in which the first object is located;
determining the direction and angle of the first object's shadow according to the relative positions of the light source and the first object;
determining the shadow region according to the direction and angle of the first object's shadow.
In a possible implementation, the processor determining the shadow region according to the direction and angle of the first object's shadow includes:
determining, according to the environmental information, a second object located between the first object and the light source;
determining, according to the relative positions of the first object, the second object, and the light source, the light-blocking area that the second object causes for the first object;
determining the shadow region according to the direction and angle of the first object's shadow together with the light-blocking area.
In a possible implementation, the processor determining the light-dark ratio model of the first object according to the environmental information includes:
determining, according to the environmental information, the illumination direction of the light source in the first object's environment relative to the first object;
determining, according to the illumination direction, the reflected-intensity ratio of each part of the first object to the light;
determining the light-dark ratio model according to the reflected-intensity ratio.
In a possible implementation, the processor determining the light-dark ratio model according to the reflected-intensity ratio includes:
determining the display-brightness ratio of each part of the first object according to the illumination intensity of the light source;
determining the light-dark ratio model according to the reflected-intensity ratio and the display-brightness ratio.
In a third aspect, another electronic device is provided, including:
a first obtaining module, for obtaining a real-scene image of a real scene that contains a first object;
a second obtaining module, for obtaining environmental information about the environment in which the first object is located and a virtual object model corresponding to the first object;
an AR processing module, for processing the virtual object model according to the environmental information, to generate an AR real-scene image corresponding to the real scene.
In the embodiments of the present invention, after a real-scene image of a real scene containing a first object is obtained, the environmental information of the first object's environment and a virtual object model corresponding to the first object are obtained, and the virtual object model is then processed according to the environmental information to generate an AR real-scene image corresponding to the real scene. Compared with the prior-art approach of applying AR processing directly to the object's virtual model, the embodiments of the present invention also take the environmental information of the first object's environment into account when producing the AR real-scene image: the first object's virtual model is first processed according to the environmental information, and the AR real-scene image is then generated from the processed virtual model. In other words, real-time environmental information is used as a processing variable during AR processing, which increases the instantaneous fusion between the virtual object model and the real scene, improves the electronic device's real-time reproduction when augmenting the real scene, and thus improves the accuracy of augmenting the present-day real scene, enhancing the user's visual experience.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the information processing method in an embodiment of the present invention;
Fig. 2 is a structural schematic diagram of the electronic device in an embodiment of the present invention;
Fig. 3 is a structural block diagram of the electronic device in an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the invention. Where no conflict arises, the embodiments of the present invention and the features in the embodiments may be combined with each other. Moreover, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one herein.
In addition, the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. Furthermore, the character "/" herein, unless otherwise stated, generally indicates an "or" relationship between the objects before and after it.
To better understand the above technical solution, it is described in detail below with reference to the accompanying drawings and specific embodiments.
The information processing method in the embodiments of the present invention can be applied to electronic devices such as mobile phones, tablet computers (PADs), personal digital assistants (PDAs), notebook computers, personal computers (PCs), smart glasses, and smart helmets. As long as the electronic device has AR functions, the embodiments of the present invention place no limit on what kind of device it is.
Referring to Fig. 1, the flow of the information processing method in an embodiment of the present invention is described as follows.
Step 101: obtain a real-scene image of a real scene that contains a first object.
First, the electronic device can obtain an image of a real scene; for ease of description, this image is called the real-scene image. Since the real scene contains the first object, the obtained real-scene image naturally also contains the image information of the first object.
For example, the real scene may be a scene containing one house or many houses, a scene containing a desk with a cup placed on it, a scene of a woman walking with a child in her arms, and so on. In short, any real scene that can be photographed can be regarded as a real scene in the embodiments of the present invention.
In a specific implementation, the electronic device can capture the real-scene image with its own image sensor (such as a camera), or it can receive a real-scene image captured and sent by another electronic device, and so on. Preferably, in order to apply AR enhancement to the real environment in which the electronic device is located, the real-scene image in the embodiments of the present invention is captured instantly by the electronic device's own image sensor.
Step 102: obtain the environmental information of the environment in which the first object is located and a virtual object model corresponding to the first object.
After the real-scene image of the real scene containing the first object is obtained, the virtual object model corresponding to the first object can be obtained. The virtual object model of the first object can be pre-stored in the electronic device; that is, the electronic device can store a model library in advance. For example, when the first object is a house, a three-dimensional model of a house can be read from the model library; when the first object is a cup, a three-dimensional model of a cup can be read from the model library; and so on.
The three-dimensional models of different objects in the pre-stored model library are initial three-dimensional models. To suit most augmented-reality scenes, an initial three-dimensional model is usually only able to embody the basic outline and shape of the object; shadows or lighting effects produced under different illumination conditions, for example, cannot be embodied by the initial model. In other words, the initial three-dimensional model can only show what the object specifically is - a cat, a cup, a computer, or a middle-aged man, for example - and cannot embody the additional information produced by the real environment, such as a shadow produced by illumination, hair blown backward while walking against a strong wind, or one part of the object appearing very dark because an obstacle partially occludes it. Alternatively, the initial three-dimensional model may represent such additional information, but only as fixed default values; being fixed and unchanging, it cannot reflect real-time changes in the real scene and cannot achieve an instant enhancement effect.
Therefore, after the real-scene image of the real scene containing the first object is obtained, the embodiments of the present invention obtain not only the virtual object model of the first object but also the environmental information of the environment in which the first object is currently located, so that the virtual object model can be further processed according to the obtained environmental information. This increases the instantaneous fusion between the virtual object model and the real scene, so that the AR processing of the first object stays consistent with the real scene as far as possible, maximizing the sense of reality.
In addition, the environment in which the first object is currently located can be the environment of the first object at the moment its image information is captured. Because the first object may be shot continuously - that is, as a video stream - the environment of the first object may change over the course of shooting. For example, at a first shooting moment the illumination intensity of the first object's environment is 4 nit, while at a second shooting moment it has dropped to 2.2 nit.
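The idea of pairing each captured frame with up-to-date environment readings can be sketched as follows; the field names and the lookup helper are assumptions for illustration, not structures defined by the patent:

```python
from dataclasses import dataclass

@dataclass
class EnvironmentSample:
    """Environment information captured alongside the video stream."""
    timestamp: float          # seconds since capture started
    illuminance_nits: float   # ambient light level reported by a sensor
    light_direction: tuple    # unit vector toward the dominant light source

def latest_sample_before(samples, t):
    """Return the most recent environment sample at or before time t,
    so each frame is processed with current lighting data."""
    candidates = [s for s in samples if s.timestamp <= t]
    return max(candidates, key=lambda s: s.timestamp) if candidates else None

samples = [
    EnvironmentSample(0.0, 4.0, (0.0, -1.0, 0.0)),   # first shooting moment
    EnvironmentSample(5.0, 2.2, (0.3, -0.9, 0.0)),   # light dimmed later
]
assert latest_sample_before(samples, 3.0).illuminance_nits == 4.0
assert latest_sample_before(samples, 6.0).illuminance_nits == 2.2
```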
Step 103: process the virtual object model according to the environmental information, to generate an AR real-scene image corresponding to the real scene.
That is, after the virtual object model of the first object and the real-time environmental information of the first object have been obtained, the virtual object model can be processed according to the environmental information, and the AR real-scene image corresponding to the real scene can then be generated. Compared with the prior-art approach of applying AR processing directly to the object's virtual model, the embodiments of the present invention also take the environmental information of the first object's environment into account when producing the AR real-scene image: the first object's virtual model is first processed according to the environmental information, and the AR real-scene image is then generated from the processed virtual model. In other words, real-time environmental information is used as a processing variable during AR processing, which increases the instantaneous fusion between the virtual object model and the real scene, improves the electronic device's real-time reproduction when augmenting the real scene, improves the accuracy of augmenting the present-day real scene, and enhances the user's visual experience.
In a specific implementation, various kinds of processing can be applied to the first object's virtual object model according to the obtained environmental information, yielding different AR object information for the first object; the image information of the first object in the real-scene image is then replaced with the AR object information, to obtain the AR real-scene image of the real scene.
Since different processing of the first object's virtual model corresponds to AR real-scene images with different effects, and since the obtained AR real-scene image should reproduce the real scene to the greatest extent, the virtual object model should be processed as diversely as possible according to the environmental information. For the convenience of those skilled in the art, two situations that have a large influence on the fidelity of AR real-scene images are illustrated below. Those skilled in the art should understand, however, that the embodiments of the present invention include but are not limited to the following two situations: under the idea of processing the virtual object model based on environmental information, other processing methods can also be derived.
The first situation:
Determine the virtual shadow model of the first object's shadow according to the environmental information, then generate the AR object information of the first object based on the virtual object model and the virtual shadow model.
That is, the first situation determines the first object's shadow according to the environmental information. In reality, objects in a real scene generally cast shadows - under sunlight or indoor light, for example - and because of factors such as the illumination direction, the position of the same object's shadow may vary widely at different times. Therefore, to embody the first object's current shadow condition accurately, the shadow region that the first object's shadow should occupy can first be determined according to the environmental information; the virtual shadow model is then determined from the shape and size of that region; finally, the shadow model is superimposed on the object model, so that the first object and its shadow at the current moment are displayed together.
In a specific implementation, the position of the light source in the first object's current environment can first be determined according to the environmental information; the direction and angle of the first object's shadow are then determined from the relative positions of the light source and the first object; and the shadow region that the first object's shadow should occupy is determined from that direction and angle.
For example, suppose a woman is standing in the street and the sun is now to her upper right. By the principle of rectilinear light propagation, the woman's shadow should lie roughly on the ground to her lower left, so the ground region to her lower left can be identified as the shadow region her shadow should occupy at the current moment. Exactly how much area the shadow region should occupy depends on the specific relative positions of the light source and the woman and on her outline.
In addition, the actual situation in which other objects lie between the first object and the light source should also be considered. For example, a second object may exist there and block the light the light source directs at the first object; the second object can then be regarded as an occluder of the first object. Because of the second object's occlusion, less light reaches the first object, which may change the first object's shadow outline and shadow area - for example, reducing the area. Therefore, to accurately determine the outline and area of the first object's shadow at this moment, the relative positions of the first object, the second object, and the light source can be used: first determine the light-blocking area that the second object causes for the first object, and then compute the final, actual shadow region from the determined light-blocking area and the initially determined shadow region.
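A crude, purely illustrative model of how an occluder's light-blocking effect could shrink the initially computed shadow region; the patent does not specify the formula, so the linear scaling below is an assumption:

```python
def effective_shadow_area(base_area, blocked_fraction):
    """Scale down the initially determined shadow area by the fraction of
    light the second object blocks (hypothetical linear model)."""
    if not 0.0 <= blocked_fraction <= 1.0:
        raise ValueError("blocked_fraction must be in [0, 1]")
    return base_area * (1.0 - blocked_fraction)

# An occluder blocking a quarter of the light shrinks a 2 m^2 shadow
# region to 1.5 m^2 under this simple model.
assert effective_shadow_area(2.0, 0.25) == 1.5
assert effective_shadow_area(2.0, 0.0) == 2.0   # no occluder, no change
```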
That is, when determining the first object's shadow region under actual illumination conditions, the influence of any occluder present between the first object and the light source should also be considered. This situation is ubiquitous in practice, and taking the occluder's influence into account makes the determination of the first object's shadow more accurate; the approach is also highly adaptable, which benefits practical application and popularization.
In practice, a sensor in the electronic device can determine the approximate position of the light source from the detected illumination strength. For example, if the intensity of the light entering from the left side of the sensor is much smaller than that entering from the right side, the light source can be considered to be to the upper right of the electronic device. Alternatively, the current weather conditions can first be obtained - say it is sunny and it is now 12:00 noon - so the sun can be determined to be roughly directly above the object; the shadow should then be close to vertical from top to bottom, its area should be minimal, and it may nearly coincide with the region occupied by the object itself. Alternatively, the electronic device can run recognition on the captured image to identify the respective positions of the light source and the first object, and then determine from those positions where the first object's shadow should lie, and so on.
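The left/right sensor comparison just described can be sketched like this; the 1.5x ratio threshold is an assumed tuning value, not a figure from the patent:

```python
def estimate_light_side(left_lux, right_lux, ratio_threshold=1.5):
    """Guess which side the light source is on by comparing illuminance
    measured on the device's left and right sides."""
    if right_lux >= left_lux * ratio_threshold:
        return "right"
    if left_lux >= right_lux * ratio_threshold:
        return "left"
    return "overhead"   # roughly balanced readings, e.g. the midday sun

# Much more light from the right implies a light source to the right.
assert estimate_light_side(100.0, 800.0) == "right"
# Nearly balanced readings suggest overhead light and a minimal shadow.
assert estimate_light_side(500.0, 480.0) == "overhead"
```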
The second situation:
Determine the light-dark ratio model of the first object according to the environmental information, then generate the AR object information of the first object based on the virtual object model and the light-dark ratio model.
Here, the light-and-shade scale model characterizes the differences in brightness among the parts of the first object when it is displayed. In other words, what the second case shows is that the brightness of each part of the first object, as displayed, can be determined from the environmental information. For example, if the light source is on the right side of a cup, the right half of the cup reflects somewhat more light than the left half, so when displayed, the right half should appear somewhat brighter than the left half. Since the first object naturally exhibits light-and-shade variation, and the brightness of each of its parts changes with the illumination direction, the illumination direction of the light source relative to the first object in its environment can first be determined from the environmental information; the reflected-intensity ratio of each part of the first object to the light is then determined from the illumination direction; and finally the light-and-shade scale model is computed from the determined reflected-intensity ratios.
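One simple way to turn an illumination direction into a per-part reflected-intensity ratio, as the paragraph above describes, is Lambert's cosine law. The patent does not prescribe any formula, so this Lambertian sketch is an assumption of ours:

```python
import math

def reflected_intensity_ratio(surface_normal, light_direction):
    """Lambertian sketch: a part whose surface normal faces the light
    reflects more of it. Both arguments are 3D vectors, and
    light_direction points from the surface toward the light source.
    """
    def unit(v):
        n = math.sqrt(sum(c * c for c in v))
        return [c / n for c in v]

    n_hat = unit(surface_normal)
    l_hat = unit(light_direction)
    dot = sum(a * b for a, b in zip(n_hat, l_hat))
    return max(0.0, dot)  # parts facing away from the light reflect none of it
```

With the light along +x, the cup's right half (normal roughly +x) gets a ratio near 1.0 while its left half (normal roughly -x) gets 0.0, matching the example above.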
In addition, while taking the illumination direction into account, the illumination intensity can be considered at the same time: the display-brightness ratio of each part of the first object can also be determined from the intensity of the light source. Continuing the example above, when the light source is to the right of the first object, the right half of the first object should appear brighter, but exactly how bright can then be determined from the illumination intensity; the light-and-shade scale model is then computed from both the reflected-intensity ratio and the display-brightness ratio. Taking illumination direction and illumination intensity into account together makes the calculation of the light-and-shade scale model more accurate, so that actual conditions are reflected as faithfully as possible, the brightness of each part of the first object as displayed in the current environment is determined more accurately, the real environment is reproduced faithfully, and the user's visual experience is improved.
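The combination described above can be sketched as a per-part brightness table. The ambient floor and the [0, 1] ranges are assumptions of ours; the patent only says the two ratios are combined:

```python
def display_brightness(reflected_ratio, light_intensity, ambient=0.1):
    """Combine a part's reflected-intensity ratio with the light
    source's intensity (both assumed in [0, 1]) into a display
    brightness. The ambient floor keeps unlit parts from being
    pure black."""
    value = ambient + (1.0 - ambient) * reflected_ratio * light_intensity
    return min(1.0, max(0.0, value))

def light_shade_scale_model(part_ratios, light_intensity):
    """Per-part brightness table standing in for the patent's
    'light-and-shade scale model'."""
    return {part: display_brightness(r, light_intensity)
            for part, r in part_ratios.items()}

# A strong light from the right brightens the right half far more:
model = light_shade_scale_model({"right_half": 1.0, "left_half": 0.2}, 0.9)
```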
In another possible embodiment, in order to reproduce the real scene to the greatest extent, the first case and the second case described above may be implemented simultaneously; the specific implementation can be understood as a combination of the two cases and is not described again here.
Referring to Fig. 2, based on the same inventive concept, an embodiment of the present invention provides an electronic device. The electronic device may be, for example, a mobile phone, a tablet computer, a palmtop computer, a notebook computer, a PC, smart glasses, or a smart helmet, as long as it has AR functionality. The electronic device includes a sensor 201 and a processor 202, where:
the sensor 201 is configured to obtain environmental information;
the processor 202, connected to the sensor 201, is configured to obtain a real-scene image that includes the first object, to obtain the environmental information of the environment in which the first object is located and a virtual object model corresponding to the first object, and to process the virtual object model according to the environmental information so as to generate an augmented-reality (AR) real-scene image corresponding to the real scene.
The sensor 201 may be a light sensor, an RGB sensor, an image sensor, or the like. Different types of sensor may be used depending on the environmental information to be obtained, as needed in a specific implementation; the possibilities are not enumerated one by one here.
The processor 202 may specifically be a general-purpose central processing unit (CPU), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), one or more integrated circuits for controlling program execution, or the like.
In addition, the electronic device may further include a display screen, which may be a light-emitting diode (Light Emitting Diode, LED) display, an organic light-emitting diode (Organic Light Emitting Diode, OLED) display, an active-matrix organic light-emitting diode (Active Matrix Organic Light Emitting Diode, AMOLED) display, an in-plane switching (In-Plane Switching, IPS) display, or the like. The display screen may have a plurality of sides, and the plurality of sides may form a roughly rectangular shape.
Further, the electronic device may also include one or more memories. A memory may include a read-only memory (Read Only Memory, ROM), a random access memory (Random Access Memory, RAM), disk storage, or the like.
The memory and the processor 202 may be connected by a bus or by a dedicated connecting line. The memory is used to store instructions, and the processor 202 is used to execute the instructions stored in the memory; when executing the instructions, it can perform the steps included in the information processing method described above.
In one possible embodiment, the processor 202 processing the virtual object model according to the environmental information to generate the AR real-scene image corresponding to the real scene may include:
determining a virtual shadow model of the shadow of the first object according to the environmental information, and/or determining a light-and-shade scale model of the first object according to the environmental information, where the light-and-shade scale model characterizes the differences in brightness among the parts of the first object when displayed;
generating AR object information of the first object based on the virtual object model together with the virtual shadow model and/or the light-and-shade scale model;
replacing the image information of the first object in the real-scene image with the AR object information to obtain the AR real-scene image.
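The final replacement step can be sketched at the pixel level. The mask marking the first object's region is an assumed input here; the patent leaves the segmentation method open:

```python
def composite_ar_object(real_pixels, ar_pixels, mask):
    """Wherever the mask flags the first object's region, take the
    rendered AR pixel; elsewhere keep the captured real-scene pixel.
    All three inputs are equally sized 2D lists (rows of pixels)."""
    return [
        [ar if inside else real
         for real, ar, inside in zip(real_row, ar_row, mask_row)]
        for real_row, ar_row, mask_row in zip(real_pixels, ar_pixels, mask)
    ]
```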
In one possible embodiment, the processor 202 determining the virtual shadow model of the shadow of the first object according to the environmental information may include:
determining, according to the environmental information, the shadow region occupied by the shadow of the first object;
determining the virtual shadow model according to the shape and size of the shadow region.
In one possible embodiment, the processor 202 determining, according to the environmental information, the shadow region occupied by the shadow of the first object may include:
determining, according to the environmental information, the position of the light source in the environment in which the first object is located;
determining the direction and angle of the shadow of the first object according to the relative positional relationship between the position of the light source and the first object;
determining the shadow region according to the direction and angle of the shadow of the first object.
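The two geometric quantities above, the direction and the angle of the shadow, can be sketched from the relative positions. The coordinate convention (z up) and the function name are assumptions of ours:

```python
import math

def shadow_direction_and_angle(light_pos, object_pos):
    """From the relative positions of the light source and the object
    (x, y, z with z up), derive the azimuth the shadow falls along
    (in the ground plane, away from the light) and the light's
    elevation angle, which fixes how long the shadow stretches."""
    dx = object_pos[0] - light_pos[0]
    dy = object_pos[1] - light_pos[1]
    height = light_pos[2] - object_pos[2]
    azimuth = math.atan2(dy, dx)                        # shadow direction, radians
    elevation = math.atan2(height, math.hypot(dx, dy))  # light angle, radians
    return azimuth, elevation
```

A light to the west and above the object pushes the shadow toward the east; the higher the light, the steeper the elevation and the shorter the shadow.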
In one possible embodiment, the processor 202 determining the shadow region according to the direction and angle of the shadow of the first object may include:
determining, according to the environmental information, a second object located between the first object and the light source;
determining the light-shielded area the second object causes on the first object according to the relative positional relationship among the first object, the second object, and the light source;
determining the shadow region according to the direction and angle of the shadow of the first object together with the light-shielded area.
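A one-dimensional sketch of the shielding step: the span of light the second object blocks is clipped against the first object's span, and whatever remains directly lit is returned. The interval convention is ours, purely for illustration:

```python
def lit_portions(object_span, blocked_span):
    """Return the sub-intervals of object_span that the occluding
    second object does NOT shield from the light. Spans are
    (start, end) pairs along one axis."""
    a0, a1 = object_span
    b0, b1 = blocked_span
    c0, c1 = max(a0, b0), min(a1, b1)   # blocked span clipped to the object
    if c0 >= c1:
        return [(a0, a1)]               # no overlap: the object is fully lit
    lit = []
    if a0 < c0:
        lit.append((a0, c0))
    if c1 < a1:
        lit.append((c1, a1))
    return lit                          # may be empty: fully shielded
```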
In one possible embodiment, the processor 202 determining the light-and-shade scale model of the first object according to the environmental information may include:
determining, according to the environmental information, the illumination direction of the light source relative to the first object in the environment in which the first object is located;
determining, according to the illumination direction, the reflected-intensity ratio of each part of the first object to the light;
determining the light-and-shade scale model according to the reflected-intensity ratios.
In one possible embodiment, the processor 202 determining the light-and-shade scale model according to the reflected-intensity ratios may include:
determining the display-brightness ratio of each part of the first object according to the illumination intensity of the light source;
determining the light-and-shade scale model according to the reflected-intensity ratios and the display-brightness ratios.
In embodiments of the present invention, the processor 202 may be designed and programmed so that code corresponding to the information processing method described above is solidified into a chip, enabling the chip at runtime to perform the steps included in that method. How to design and program the processor 202 is a technique well known to those skilled in the art and is not described again here.
Referring to Fig. 3, based on the same inventive concept, an embodiment of the present invention provides another electronic device, which includes a first obtaining module 301, a second obtaining module 302, and an AR processing module 303. In this embodiment, the relevant functions of the first obtaining module 301, the second obtaining module 302, and the AR processing module 303 may be implemented by a hardware processor. Here:
the first obtaining module 301 is configured to obtain a real-scene image of the real scene that includes the first object;
the second obtaining module 302 is configured to obtain the environmental information of the environment in which the first object is located and a virtual object model corresponding to the first object;
the AR processing module 303 is configured to process the virtual object model according to the environmental information to generate an augmented-reality (AR) real-scene image corresponding to the real scene.
In one possible embodiment, the AR processing module 303 may be configured to:
determine a virtual shadow model of the shadow of the first object according to the environmental information, and/or determine a light-and-shade scale model of the first object according to the environmental information, where the light-and-shade scale model characterizes the differences in brightness among the parts of the first object when displayed;
generate AR object information of the first object based on the virtual object model together with the virtual shadow model and/or the light-and-shade scale model;
replace the image information of the first object in the real-scene image with the AR object information to obtain the AR real-scene image.
In one possible embodiment, the AR processing module 303 determining the virtual shadow model of the shadow of the first object according to the environmental information may include:
determining, according to the environmental information, the shadow region occupied by the shadow of the first object;
determining the virtual shadow model according to the shape and size of the shadow region.
In one possible embodiment, the AR processing module 303 determining, according to the environmental information, the shadow region occupied by the shadow of the first object may include:
determining, according to the environmental information, the position of the light source in the environment in which the first object is located;
determining the direction and angle of the shadow of the first object according to the relative positional relationship between the position of the light source and the first object;
determining the shadow region according to the direction and angle of the shadow of the first object.
In one possible embodiment, the AR processing module 303 determining the shadow region according to the direction and angle of the shadow of the first object may include:
determining, according to the environmental information, the second object located between the first object and the light source;
determining the light-shielded area the second object causes on the first object according to the relative positional relationship among the first object, the second object, and the light source;
determining the shadow region according to the direction and angle of the shadow of the first object together with the light-shielded area.
In one possible embodiment, the AR processing module 303 determining the light-and-shade scale model of the parts of the first object according to the environmental information may include:
determining, according to the environmental information, the illumination direction of the light source relative to the first object in the environment in which the first object is located;
determining, according to the illumination direction, the reflected-intensity ratio of each part of the first object to the light;
determining the light-and-shade scale model according to the reflected-intensity ratios.
In one possible embodiment, the AR processing module 303 determining the light-and-shade scale model according to the reflected-intensity ratios may include:
determining the display-brightness ratio of each part of the first object according to the illumination intensity of the light source;
determining the light-and-shade scale model according to the reflected-intensity ratios and the display-brightness ratios.
The variations and specific examples of the information processing method of Fig. 1 above apply equally to the electronic device of this embodiment. From the foregoing detailed description of the information processing method, those skilled in the art can clearly understand how the electronic device in this embodiment is implemented; for brevity of the specification, reference may be made to the implementation of the information processing method described above, and it is not described in detail here.
It is apparent to those skilled in the art that, for convenience and brevity of description, the division into the functional modules described above is merely illustrative. In practical applications, the functions above may be assigned to different functional units as needed; that is, the internal structure of the device may be divided into different functional units to complete all or part of the functions described above. For the specific working process of the system, apparatus, and units described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
In the several embodiments provided by the present invention, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely schematic: the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected as actually needed to achieve the purpose of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, may exist separately and physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented as a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. On this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
Specifically, the computer program instructions corresponding to the information processing method in the embodiment of the present invention may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive. When the computer program instructions in the storage medium corresponding to the information processing method are read or executed by an electronic device, the following steps are performed:
obtaining a real-scene image of the real scene that includes the first object;
obtaining the environmental information of the environment in which the first object is located and a virtual object model corresponding to the first object;
processing the virtual object model according to the environmental information to generate an augmented-reality (AR) real-scene image corresponding to the real scene.
Optionally, the storage medium further stores computer program instructions corresponding to the step of processing the virtual object model according to the environmental information to generate the AR real-scene image corresponding to the real scene; when executed, these instructions specifically perform the following steps:
determining a virtual shadow model of the shadow of the first object according to the environmental information, and/or determining a light-and-shade scale model of the first object according to the environmental information, where the light-and-shade scale model characterizes the differences in brightness among the parts of the first object when displayed;
generating AR object information of the first object based on the virtual object model together with the virtual shadow model and/or the light-and-shade scale model;
replacing the image information of the first object in the real-scene image with the AR object information to obtain the AR real-scene image.
Optionally, the storage medium further stores computer program instructions corresponding to the step of determining the virtual shadow model of the shadow of the first object according to the environmental information; when executed, these instructions specifically perform the following steps:
determining, according to the environmental information, the shadow region occupied by the shadow of the first object;
determining the virtual shadow model according to the shape and size of the shadow region.
Optionally, the storage medium further stores computer program instructions corresponding to the step of determining, according to the environmental information, the shadow region occupied by the shadow of the first object; when executed, these instructions specifically perform the following steps:
determining, according to the environmental information, the position of the light source in the environment in which the first object is located;
determining the direction and angle of the shadow of the first object according to the relative positional relationship between the position of the light source and the first object;
determining the shadow region according to the direction and angle of the shadow of the first object.
Optionally, the storage medium further stores computer program instructions corresponding to the step of determining the shadow region according to the direction and angle of the shadow of the first object; when executed, these instructions specifically perform the following steps:
determining, according to the environmental information, the second object located between the first object and the light source;
determining the light-shielded area the second object causes on the first object according to the relative positional relationship among the first object, the second object, and the light source;
determining the shadow region according to the direction and angle of the shadow of the first object together with the light-shielded area.
Optionally, the storage medium further stores computer program instructions corresponding to the step of determining the light-and-shade scale model of the parts of the first object according to the environmental information; when executed, these instructions specifically perform the following steps:
determining, according to the environmental information, the illumination direction of the light source relative to the first object in the environment in which the first object is located;
determining, according to the illumination direction, the reflected-intensity ratio of each part of the first object to the light;
determining the light-and-shade scale model according to the reflected-intensity ratios.
Optionally, the storage medium further stores computer program instructions corresponding to the step of determining the light-and-shade scale model according to the reflected-intensity ratios; when executed, these instructions specifically perform the following steps:
determining the display-brightness ratio of each part of the first object according to the illumination intensity of the light source;
determining the light-and-shade scale model according to the reflected-intensity ratios and the display-brightness ratios.
Although preferred embodiments of the present application have been described, those skilled in the art, once they grasp the basic inventive concept, can make further changes and modifications to these embodiments. The appended claims are therefore intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the present application.
Obviously, those skilled in the art can make various changes and modifications to the present application without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the present application and their technical equivalents, the present application is also intended to include them.
Claims (14)
1. An information processing method, comprising:
obtaining a real-scene image of a real scene that includes a first object;
obtaining environmental information of the environment in which the first object is located and a virtual object model corresponding to the first object; and
processing the virtual object model according to the environmental information to generate an augmented-reality (AR) real-scene image corresponding to the real scene.
2. The method according to claim 1, wherein processing the virtual object model according to the environmental information to generate the AR real-scene image corresponding to the real scene comprises:
determining a virtual shadow model of a shadow of the first object according to the environmental information, and/or determining a light-and-shade scale model of the first object according to the environmental information, wherein the light-and-shade scale model characterizes differences in brightness among parts of the first object when the first object is displayed;
generating AR object information of the first object based on the virtual object model together with the virtual shadow model and/or the light-and-shade scale model; and
replacing image information of the first object in the real-scene image with the AR object information to obtain the AR real-scene image.
3. The method according to claim 2, wherein determining the virtual shadow model of the shadow of the first object according to the environmental information comprises:
determining, according to the environmental information, a shadow region occupied by the shadow of the first object; and
determining the virtual shadow model according to the shape and size of the shadow region.
4. The method according to claim 3, wherein determining, according to the environmental information, the shadow region occupied by the shadow of the first object comprises:
determining, according to the environmental information, a position of a light source in the environment in which the first object is located;
determining a direction and an angle of the shadow of the first object according to the relative positional relationship between the position of the light source and the first object; and
determining the shadow region according to the direction and the angle of the shadow of the first object.
5. The method according to claim 4, wherein determining the shadow region according to the direction and the angle of the shadow of the first object comprises:
determining, according to the environmental information, a second object located between the first object and the light source;
determining a light-shielded area the second object causes on the first object according to the relative positional relationship among the first object, the second object, and the light source; and
determining the shadow region according to the direction and the angle of the shadow of the first object together with the light-shielded area.
6. The method according to claim 2, wherein determining the light-and-shade scale model of the first object according to the environmental information comprises:
determining, according to the environmental information, an illumination direction of a light source relative to the first object in the environment in which the first object is located;
determining, according to the illumination direction, a reflected-intensity ratio of each part of the first object to the light; and
determining the light-and-shade scale model according to the reflected-intensity ratios.
7. The method according to claim 6, wherein determining the light-and-shade scale model according to the reflected-intensity ratios comprises:
determining a display-brightness ratio of each part of the first object according to the illumination intensity of the light source; and
determining the light-and-shade scale model according to the reflected-intensity ratios and the display-brightness ratios.
8. An electronic device, comprising:
a sensor configured to obtain environmental information; and
a processor, connected to the sensor, configured to obtain a real-scene image that includes a first object, to obtain the environmental information of the environment in which the first object is located and a virtual object model corresponding to the first object, and to process the virtual object model according to the environmental information to generate an augmented-reality (AR) real-scene image corresponding to the real scene.
9. The electronic device according to claim 8, wherein the processor processing the virtual object model according to the environmental information to generate the AR real-scene image corresponding to the real scene comprises:
determining a virtual shadow model of a shadow of the first object according to the environmental information, and/or determining a light-and-shade scale model of the first object according to the environmental information, wherein the light-and-shade scale model characterizes differences in brightness among parts of the first object when the first object is displayed;
generating AR object information of the first object based on the virtual object model together with the virtual shadow model and/or the light-and-shade scale model; and
replacing image information of the first object in the real-scene image with the AR object information to obtain the AR real-scene image.
10. The electronic device according to claim 9, wherein the processor determining the virtual shadow model of the shadow of the first object according to the environmental information comprises:
determining, according to the environmental information, a shadow region occupied by the shadow of the first object; and
determining the virtual shadow model according to the shape and size of the shadow region.
11. The electronic device according to claim 10, wherein the processor determining, according to the environmental information, the shadow region occupied by the shadow of the first object comprises:
determining, according to the environmental information, a position of a light source in the environment in which the first object is located;
determining a direction and an angle of the shadow of the first object according to the relative positional relationship between the position of the light source and the first object; and
determining the shadow region according to the direction and the angle of the shadow of the first object.
12. The electronic device according to claim 11, wherein the processor determining the shadow region according to the direction and the angle of the shadow of the first object comprises:
determining, according to the environmental information, a second object located between the first object and the light source;
determining a light-shielded area the second object causes on the first object according to the relative positional relationship among the first object, the second object, and the light source; and
determining the shadow region according to the direction and the angle of the shadow of the first object together with the light-shielded area.
13. The electronic device according to claim 9, wherein the processor determining the light-and-shade scale model of the first object according to the environmental information comprises:
determining, according to the environmental information, an illumination direction of a light source relative to the first object in the environment in which the first object is located;
determining, according to the illumination direction, a reflected-intensity ratio of each part of the first object to the light; and
determining the light-and-shade scale model according to the reflected-intensity ratios.
14. The electronic device according to claim 13, wherein the processor determining the light-and-shade scale model according to the reflected-intensity ratios comprises:
determining a display-brightness ratio of each part of the first object according to the illumination intensity of the light source; and
determining the light-and-shade scale model according to the reflected-intensity ratios and the display-brightness ratios.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710203896.7A CN107025683A (en) | 2017-03-30 | 2017-03-30 | A kind of information processing method and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710203896.7A CN107025683A (en) | 2017-03-30 | 2017-03-30 | A kind of information processing method and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107025683A true CN107025683A (en) | 2017-08-08 |
Family
ID=59526387
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710203896.7A Pending CN107025683A (en) | 2017-03-30 | 2017-03-30 | A kind of information processing method and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107025683A (en) |
2017
- 2017-03-30: Application CN201710203896.7A filed in China (CN); legal status: active, Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101520904A (en) * | 2009-03-24 | 2009-09-02 | 上海水晶石信息技术有限公司 | Reality augmenting method with real environment estimation and reality augmenting system |
CN101710429A (en) * | 2009-10-12 | 2010-05-19 | 湖南大学 | Illumination algorithm of augmented reality system based on dynamic light map |
CN102324106A (en) * | 2011-06-02 | 2012-01-18 | 武汉大学 | SFS (Shape From Shading) three-dimensional reconstruction sparse-DEM (Digital Elevation Model) encrypting method considering surface spectral information |
CN102426695A (en) * | 2011-09-30 | 2012-04-25 | 北京航空航天大学 | Virtual-real illumination fusion method of single image scene |
CN105005970A (en) * | 2015-06-26 | 2015-10-28 | 广东欧珀移动通信有限公司 | Augmented reality implementation method and apparatus |
CN105184858A (en) * | 2015-09-18 | 2015-12-23 | 上海历影数字科技有限公司 | Method for augmented reality mobile terminal |
CN105844714A (en) * | 2016-04-12 | 2016-08-10 | 广州凡拓数字创意科技股份有限公司 | Augmented reality based scenario display method and system |
Non-Patent Citations (3)
Title |
---|
Qiu Lu: "Research on Light and Shadow Algorithms for Improving the Realism of Augmented Reality", China Master's Theses Full-text Database, Information Science and Technology Series * |
Wu Liangliang: "Research on Illumination Consistency for Augmented Reality Based on Fusing Three-Dimensional Shadows with the Background", China Master's Theses Full-text Database, Information Science and Technology Series * |
Tan Ming: "Research on Illumination Consistency in Vision-Based Augmented Reality Systems", China Master's Theses Full-text Database, Information Science and Technology Series * |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107749083A (en) * | 2017-09-28 | 2018-03-02 | 联想(北京)有限公司 | Image display method and apparatus |
CN107590484A (en) * | 2017-09-29 | 2018-01-16 | 百度在线网络技术(北京)有限公司 | Method and apparatus for information to be presented |
CN108010118A (en) * | 2017-11-28 | 2018-05-08 | 网易(杭州)网络有限公司 | Virtual objects processing method, virtual objects processing unit, medium and computing device |
CN108010118B (en) * | 2017-11-28 | 2021-11-30 | 杭州易现先进科技有限公司 | Virtual object processing method, virtual object processing apparatus, medium, and computing device |
CN108010120A (en) * | 2017-11-30 | 2018-05-08 | 网易(杭州)网络有限公司 | Static shadow display method, device, storage medium, processor and terminal |
WO2019153925A1 (en) * | 2018-02-06 | 2019-08-15 | 北京搜狗科技发展有限公司 | Searching method and related device |
CN108961197A (en) * | 2018-06-27 | 2018-12-07 | 联想(北京)有限公司 | A kind of object synthetic method and device |
US20200082581A1 (en) * | 2018-09-07 | 2020-03-12 | Industrial Technology Research Institute | Method and apparatus for displaying information of multiple objects |
CN110888522A (en) * | 2018-09-07 | 2020-03-17 | 财团法人工业技术研究院 | Multi-target information display method and device |
TWI691891B (en) * | 2018-09-07 | 2020-04-21 | 財團法人工業技術研究院 | Method and apparatus for displaying information of multiple objects |
CN110888522B (en) * | 2018-09-07 | 2023-06-27 | 财团法人工业技术研究院 | Multiple target information display method and device |
US10755456B2 (en) | 2018-09-07 | 2020-08-25 | Industrial Technology Research Institute | Method and apparatus for displaying information of multiple objects |
CN109214351A (en) * | 2018-09-20 | 2019-01-15 | 太平洋未来科技(深圳)有限公司 | A kind of AR imaging method, device and electronic equipment |
CN109214351B (en) * | 2018-09-20 | 2020-07-07 | 太平洋未来科技(深圳)有限公司 | AR imaging method and device and electronic equipment |
CN111161393A (en) * | 2019-12-31 | 2020-05-15 | 威创集团股份有限公司 | Real-time light effect dynamic display method and system based on 3D map |
CN111161393B (en) * | 2019-12-31 | 2023-10-10 | 威创集团股份有限公司 | Real-time light effect dynamic display method and system based on 3D map |
CN111260769A (en) * | 2020-01-09 | 2020-06-09 | 北京中科深智科技有限公司 | Real-time rendering method and device based on dynamic illumination change |
CN111260769B (en) * | 2020-01-09 | 2021-04-13 | 北京中科深智科技有限公司 | Real-time rendering method and device based on dynamic illumination change |
CN111340931A (en) * | 2020-02-17 | 2020-06-26 | 广州虎牙科技有限公司 | Scene processing method and device, user side and storage medium |
CN111462294A (en) * | 2020-03-27 | 2020-07-28 | 咪咕视讯科技有限公司 | Image processing method, electronic equipment and computer readable storage medium |
CN111462295B (en) * | 2020-03-27 | 2023-09-05 | 咪咕文化科技有限公司 | Shadow processing method, device and storage medium in augmented reality shooting |
CN111462295A (en) * | 2020-03-27 | 2020-07-28 | 咪咕文化科技有限公司 | Shadow processing method, device and storage medium in augmented reality snap |
CN111462294B (en) * | 2020-03-27 | 2024-03-22 | 咪咕视讯科技有限公司 | Image processing method, electronic equipment and computer readable storage medium |
CN113095240A (en) * | 2021-04-16 | 2021-07-09 | 青岛海尔电冰箱有限公司 | Method for identifying information of articles in refrigerator, refrigerator and computer storage medium |
CN113095240B (en) * | 2021-04-16 | 2023-08-29 | 青岛海尔电冰箱有限公司 | Method for identifying article information in refrigerator, refrigerator and computer storage medium |
CN114385289A (en) * | 2021-12-23 | 2022-04-22 | 北京字跳网络技术有限公司 | Rendering display method and device, computer equipment and storage medium |
CN114385289B (en) * | 2021-12-23 | 2024-01-23 | 北京字跳网络技术有限公司 | Rendering display method and device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107025683A (en) | A kind of information processing method and electronic equipment | |
CN106534835B (en) | A kind of image processing method and device | |
AU2014204252B2 (en) | Extramissive spatial imaging digital eye glass for virtual or augmediated vision | |
CN103810504B (en) | Image processing method and device | |
CN101183276A (en) | Interactive system based on CCD camera porjector technology | |
CN114972617B (en) | Scene illumination and reflection modeling method based on differentiable rendering | |
CN111783525A (en) | Aerial photographic image target sample generation method based on style migration | |
CN108525298A (en) | Image processing method, device, storage medium and electronic equipment | |
CN109242961A (en) | A kind of face modeling method, apparatus, electronic equipment and computer-readable medium | |
CN107886562A (en) | Water surface rendering method, device and readable storage medium | |
CN115115688B (en) | Image processing method and electronic equipment | |
CN109147023A (en) | Three-dimensional special efficacy generation method, device and electronic equipment based on face | |
CN108010118A (en) | Virtual objects processing method, virtual objects processing unit, medium and computing device | |
CN108132712A (en) | Rendering method and device for weather states in a virtual scene, storage medium and terminal | |
CN108043027B (en) | Storage medium, electronic device, game screen display method and device | |
CN109688343A (en) | The implementation method and device of augmented reality studio | |
CN107909638A (en) | Rendering method, medium, system and electronic device for virtual objects | |
Wang et al. | Deep learning‐based vehicle detection with synthetic image data | |
CN103903296A (en) | Method for shadow rendering in virtual home decoration indoor scene design | |
CN112529022A (en) | Training sample generation method and device | |
CN112308977A (en) | Video processing method, video processing apparatus, and storage medium | |
WO2014170757A2 (en) | 3d rendering for training computer vision recognition | |
CN117274353B (en) | Synthetic image data generating method, control device and readable storage medium | |
CN113610955A (en) | Object rendering method and device and shader | |
CN116894922A (en) | Night vision image generation method based on real-time graphic engine |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170808 |