CN107749076B - Method and device for generating real illumination in augmented reality scene - Google Patents

Method and device for generating real illumination in augmented reality scene

Info

Publication number
CN107749076B
Authority
CN
China
Prior art keywords
virtual object
real scene
illumination
picture
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711072544.9A
Other languages
Chinese (zh)
Other versions
CN107749076A (en
Inventor
休·伊恩·罗伊
李建亿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pacific Future Technology Shenzhen Co ltd
Original Assignee
Pacific Future Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pacific Future Technology Shenzhen Co ltd filed Critical Pacific Future Technology Shenzhen Co ltd
Priority to CN201711072544.9A priority Critical patent/CN107749076B/en
Publication of CN107749076A publication Critical patent/CN107749076A/en
Application granted granted Critical
Publication of CN107749076B publication Critical patent/CN107749076B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/04 Supports for telephone transmitters or receivers
    • H04M1/05 Supports for telephone transmitters or receivers specially adapted for use on head, throat or breast

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An embodiment of the invention provides a method and a device for generating real illumination in an augmented reality scene, belonging to the technical field of augmented reality. The method comprises the following steps: acquiring a real scene picture corresponding to the user's current viewing angle and first illumination information of the real scene picture; searching for a virtual object matching the real scene picture; and generating an illumination effect for the virtual object according to the first illumination information, the position of the virtual object in the real scene picture, and/or the user's current viewing angle. The embodiment fuses the virtual object's illumination effect with the real scene, achieving seamless synthesis of the real scene and the virtual object and a stronger sense of reality for the user.

Description

Method and device for generating real illumination in augmented reality scene
Technical Field
The invention relates to the technical field of augmented reality, in particular to a method and a device for generating real illumination in an augmented reality scene.
Background
Augmented Reality (AR) is a technology in which a computer-generated virtual object is superimposed on a real scene by means of hardware and software. Augmented reality augments and extends the real world by fusing real and virtual objects, providing a space in which the virtual world and the real world connect. Using an AR device, a user can perceive the presence of a virtual object in the real world. For example, when the user wears a head-mounted AR device, real environment data are collected through a camera in the device, and computer-generated virtual effects are then fused with the real environment data. Application scenarios are diverse: in the user's home, for instance, a head-mounted AR helmet can fuse a virtual decoration effect with the real home environment. In fact, an AR helmet may adopt a design similar to that of common VR helmets on the market; when a smartphone is used with specially made lenses to play a fully virtual picture, the helmet acts as a VR device.
However, existing AR devices suffer from software and hardware drawbacks as follows:
Because virtual objects are generated in advance by a computer, the illumination information of the real scene cannot be obtained beforehand. As a result, the illumination effects of the virtual object and the real scene differ and the two cannot be fused: the user visually perceives an incongruity between the appearances of virtual and real scenery, and seamless synthesis of the real scene and the virtual object cannot be achieved.
In existing AR helmets, mounting and removing the mobile phone is inconvenient, and the phone surface is easily scratched during mounting and removal. The clamping plates press against the phone's rear shell for long periods, which hinders heat dissipation. Phones of different screen sizes and thicknesses require complicated structures for adaptive adjustment; such structures cannot adjust the clamping force, are likewise unfavorable to heat dissipation, and are prone to shaking and wobbling during use, which degrades the user's sense of immersion or even causes discomfort such as dizziness.
Disclosure of Invention
The method and the device for generating real illumination in an augmented reality scene provided by the embodiments of the invention aim to solve at least one of the above problems in the related art.
An embodiment of the present invention provides a method for generating real illumination in an augmented reality scene, including:
acquiring a real scene picture corresponding to a current visual angle of a user and first illumination information of the real scene picture;
searching a virtual object matched with the picture of the real scene;
and generating the illumination effect of the virtual object according to the first illumination information, the position of the virtual object in the picture of the real scene and/or the current view angle of the user.
Optionally, the generating, according to the first illumination information, the position of the virtual object in the real scene picture, and/or the current view angle of the user, an illumination effect of the virtual object includes: determining the direction of incident light according to the current visual angle of the user and the position of the virtual object in the picture of the real scene; calculating second illumination information reflected to the eyes of the user by the virtual object according to the incident light direction and the first illumination information; and generating the illumination effect of the virtual object according to the second illumination information.
Optionally, the generating, according to the first illumination information and the position of the virtual object in the picture of the real scene, an illumination effect of the virtual object includes: generating diffuse reflection information of the real scene picture according to the first illumination information; and generating the illumination effect of the virtual object according to the position of the virtual object in the picture of the real scene and the diffuse reflection information.
Optionally, the generating diffuse reflection information of the real scene picture according to the first illumination information includes: determining the normal direction of the real scene picture at a preset position; and calculating diffuse reflection illumination information of the pixels in the real scene picture in the normal direction according to the first illumination information.
Optionally, the generating an illumination effect of the virtual object according to the position of the virtual object in the real scene picture and the diffuse reflection information includes: determining the normal direction of the virtual object according to the position of the virtual object in the picture of the real scene; acquiring diffuse reflection illumination information of the virtual object from the diffuse reflection information of the real scene picture according to the normal direction of the virtual object; and generating the illumination effect of the virtual object according to the diffuse reflection information of the virtual object.
Optionally, the method is applied to an AR helmet, comprising a grip, a lens and a head-mount,
the clamping part comprises a base, a base plate and an inner frame, the base plate and the inner frame both being mounted on the base, with the inner frame on the side close to the lens part and the base plate on the side away from the lens part; a clamping device is arranged on the base plate and comprises an installation hole together with an installation cover, a first bolt, a guide sleeve and a guide pin arranged in the installation hole; the installation hole comprises adjacent first and second sections, the inner diameter of the first section being smaller than that of the second section; an end cover is arranged on the outer end of the second section, an adjusting ring is arranged at the end of the second section close to the first section, and the inner end of the guide sleeve carries a limit flange which cooperates with the adjusting ring to limit the moving stroke of the guide sleeve; a shaft hole is arranged on the installation cover, and the first bolt is installed on the installation cover through the shaft hole; the outer end of the first bolt is connected to a first screwing piece, and its inner end is in threaded connection with the inner end of the guide sleeve installed in the installation hole; the outer end of the guide sleeve is provided with a pressing end for pressing the mobile phone, and the outer wall of the guide sleeve is provided with a horizontal groove matching the guide pin, one end of the guide pin being installed on the inner wall of the installation hole and the other end in the groove;
the mobile phone obtains a real scene picture corresponding to a current visual angle of a user and first illumination information of the real scene picture through a camera device carried by the mobile phone, searches for a virtual object matched with the real scene picture, and generates an illumination effect of the virtual object according to the first illumination information, the position of the virtual object on the real scene picture and/or the current visual angle of the user.
Optionally, in the AR helmet, the clamping part is in sliding fit with the lens part; the lens part is provided with a mounting plate on which the clamping part is mounted, the mounting plate is provided with a plurality of rollers evenly spaced along its width direction, and the clamping part has a locking structure for locking the guide sleeve and the rollers.
Optionally, the locking structure of the AR helmet comprises a return spring, together with a sleeve and a threaded sleeve arranged below the guide sleeve and bilaterally symmetric about it. The upper parts of the inner ends of the sleeve and the threaded sleeve are provided with first locking parts matching the size of the lower outer wall of the guide sleeve, and the lower parts of their inner ends are provided with second locking parts matching the size of the roller. The inner end of the sleeve is provided with a first spring groove and the inner end of the threaded sleeve with a second spring groove; one end of the return spring sits in the first spring groove and the other end in the second spring groove. A second bolt runs through the sleeve and the threaded sleeve, which are connected by the second bolt and a matching locking nut, and at least one end of the second bolt is provided with a second screwing piece.
Optionally, a plurality of support bars extend from the pressing end of the AR helmet, and the end of each support bar is provided with a support point contacting the rear shell of the mobile phone. A support bar carries a micro fan provided with a touch switch, and is provided with at least one through hole in which a driving piece made of shape memory alloy is installed; one end of the driving piece is connected to the touch switch and the other end abuts against the rear shell of the mobile phone. The driving piece is in the martensite state when the temperature of the rear shell of the mobile phone reaches an early-warning value, turning the micro fan on through the touch switch, and is in the austenite state when the temperature of the rear shell falls below the early-warning value, turning the micro fan off through the touch switch;
the base plate is provided with a groove matched with the first screwing piece, and the first screwing piece is located in the groove.
Another aspect of the embodiments of the present invention provides a device for generating real illumination in an augmented reality scene, including:
an acquisition module, configured to acquire a real scene picture corresponding to a current viewing angle of a user and first illumination information of the real scene picture;
the searching module is used for searching a virtual object matched with the picture of the real scene;
and the generating module is used for generating the illumination effect of the virtual object according to the first illumination information, the position of the virtual object in the picture of the real scene and/or the current view angle of the user.
Optionally, the generating module includes: a determining unit, configured to determine the incident light direction according to the user's current viewing angle and the position of the virtual object in the real scene picture; a calculating unit, configured to calculate second illumination information reflected by the virtual object to the user's eyes according to the incident light direction and the first illumination information; and a first generating unit, configured to generate the illumination effect of the virtual object according to the second illumination information.
Optionally, the generating module includes: a second generating unit, configured to generate diffuse reflection information of the real scene picture according to the first illumination information; and a third generating unit, configured to generate the illumination effect of the virtual object according to the position of the virtual object in the real scene picture and the diffuse reflection information.
Optionally, the second generating unit is configured to determine the normal direction of the real scene picture at a preset position, and calculate diffuse reflection illumination information of the pixels in the real scene picture in the normal direction according to the first illumination information.
Optionally, the third generating unit is configured to: determine the normal direction of the virtual object according to its position in the real scene picture; acquire diffuse reflection illumination information of the virtual object from the diffuse reflection information of the real scene picture according to the normal direction of the virtual object; and generate the illumination effect of the virtual object according to the diffuse reflection information of the virtual object.
Another aspect of an embodiment of the present invention provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method for generating real illumination in an augmented reality scene according to any one of the embodiments of the present invention.
According to the above technical schemes, the method, device and electronic device for generating real illumination in an augmented reality scene provided by the embodiments of the invention acquire a real scene picture corresponding to the user's current viewing angle and first illumination information of that picture; search for a virtual object matching the real scene picture; and generate an illumination effect for the virtual object according to the first illumination information, the position of the virtual object in the real scene picture, and/or the user's current viewing angle. The embodiments fuse the virtual object's illumination effect with the real scene, achieving seamless synthesis of the real scene and the virtual object and a stronger sense of reality for the user. Meanwhile, the mechanical structure of the AR helmet on which the method runs is designed so that the mobile phone can be mounted and removed more easily and dissipates heat better, and shaking and wobbling are less likely during use, enhancing the user's immersion and sense of reality.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the following drawings show only some embodiments of the present invention, and a person skilled in the art may obtain other drawings based on them.
Fig. 1 is a flowchart of a method for generating reality illumination in an augmented reality scene according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for generating reality illumination in an augmented reality scene according to an embodiment of the present invention;
FIG. 3 is a block diagram of an apparatus for generating real illumination in an augmented reality scene according to an embodiment of the present invention;
FIG. 4 is a block diagram of an apparatus for generating real illumination in an augmented reality scene according to an embodiment of the present invention;
FIG. 5 is a block diagram of an apparatus for generating real illumination in an augmented reality scene according to an embodiment of the present invention;
fig. 6 is a schematic diagram of the hardware structure of an electronic device for performing the method for generating real illumination in an augmented reality scene according to an embodiment of the present invention;
FIG. 7 is a schematic structural view of an AR helmet provided in accordance with an embodiment of the present invention;
FIG. 8 is a schematic diagram of a clamping device of an AR helmet according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a locking structure of an AR helmet according to an embodiment of the present invention;
fig. 10 is a schematic structural view of a support bar of an AR helmet according to an embodiment of the present invention.
Detailed Description
In order to help those skilled in the art better understand the technical solutions in the embodiments of the present invention, these solutions are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the protection scope of the embodiments of the present invention.
The execution subject of the embodiments of the invention is an electronic device, including but not limited to a mobile phone, a tablet computer, a head-mounted AR (augmented reality) device and AR glasses. To better explain the following embodiments, the application scenario of the invention is described first. When a user views a real scene through the electronic device, both the real content of the scene and a computer-generated virtual object are presented, coexisting in the same picture; in terms of perception and experience, the user is presented with an augmented reality environment that integrates the virtual object and the real content.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Fig. 1 is a flowchart of a method for generating reality illumination in an augmented reality scene according to an embodiment of the present invention. As shown in fig. 1, the method for generating a real illumination in an augmented reality scene provided by the embodiment of the present invention specifically includes:
s101, acquiring a real scene picture corresponding to a current visual angle of a user and first illumination information of the real scene picture.
In this step, when the user watches, through the electronic device, the augmented effect of a virtual object superimposed on the real scene, the electronic device may acquire the user's current viewing angle through a gyroscope or a gravity sensor, and determine from that angle the range of the real scene the user can see, thereby obtaining the real scene picture.
In this step, the first illumination information of the real scene picture includes, but is not limited to, the intensity and the position of the light source. Optionally, it is first determined from the real scene picture whether the scene is illuminated by the sun; if so, the first illumination information is the illumination information of the sunlight, which may be obtained as follows:
Besides varying by region, the sun's trajectory differs in angle and height relative to the ground across different seasons and different times of day. The sun's azimuth parameters (elevation angle and azimuth angle) therefore need to be calculated from the latitude, longitude and current time, and then looked up in a pre-established table mapping solar azimuth parameters to illumination information to obtain the first illumination information of the real scene picture. In a specific implementation, the latitude and longitude corresponding to the geographical position of the real scene can be determined, and the current time of the real scene can be determined from the brightness of the real scene and/or object information (for example, the time shown on a clock in the real scene, the activity state of a person, and the like).
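The azimuth-parameter computation is left abstract in the text above. As an illustrative sketch only (not the patent's method), solar elevation and azimuth can be approximated from latitude, day of year and local solar time using standard astronomical formulas; the simplified declination fit below ignores the equation of time and longitude corrections, so results are accurate only to a few degrees:

```python
import math

def solar_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth in degrees.

    Assumptions: 'solar_hour' is local *solar* time (noon = sun at its
    highest); declination uses a simple cosine fit."""
    lat = math.radians(lat_deg)
    # Approximate solar declination for the given day of the year
    decl = math.radians(
        -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    # Hour angle: the sun moves 15 degrees per hour from solar noon
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))

    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elevation = math.asin(max(-1.0, min(1.0, sin_el)))

    # Azimuth measured clockwise from true north
    cos_az = ((math.sin(decl) - math.sin(elevation) * math.sin(lat))
              / max(1e-9, math.cos(elevation) * math.cos(lat)))
    azimuth = math.acos(max(-1.0, min(1.0, cos_az)))
    if solar_hour > 12.0:  # afternoon: the sun is west of due south
        azimuth = 2.0 * math.pi - azimuth
    return math.degrees(elevation), math.degrees(azimuth)

# Latitude 40 N on the June solstice (day 172): high noon sun, roughly due south
el_noon, az_noon = solar_position(40.0, 172, 12.0)
el_morning, _ = solar_position(40.0, 172, 9.0)
```

A lookup table keyed on the rounded (elevation, azimuth) pair could then supply the pre-measured illumination information the correspondence table describes.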
S102, searching a virtual object matched with the real scene picture.
The virtual object is an object superimposed in a real scene and visible to a user through an electronic device, and may include virtual contents such as a real object image (e.g., an image of a real object such as a person, an animal, or an article), a special effect (e.g., a smoke effect, a steam effect, a motion trajectory effect, and the like), a natural phenomenon (e.g., rain, snow, a rainbow, a sun aperture, and the like), and may also replace a certain part of the person, the animal, the article, information, and the like in the real scene, and the virtual object may be static or dynamic, which is not limited herein. The virtual object matched with the picture of the real scene can be a virtual object matched with the characteristics of the real scene, and can also be a virtual object embodied by the matching of the real scene and the surrounding scenery.
Specifically, virtual objects corresponding to different scenes may be preset, in this step, a target scene with the highest similarity to a real scene is determined according to an image recognition technology, and a virtual object matching the real scene is obtained by searching for the virtual object of the target scene object; or a virtual object library may be established in advance, a target object to be enhanced in the real scene is obtained by analyzing the picture of the real scene, and a virtual object matched with the target object is searched in the virtual object library and is used as the virtual object matched with the real scene. When the virtual object is found, the position of the virtual object in the real scene picture is also determined.
For example, a target object to be augmented in the real scene may be recognized as person A: one of A's hands is making a phone call, the other is scratching the head, and the face wears a shy smile, from which A can be judged to be nervous and shy; an appointment-related virtual object can therefore be matched to A.
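The preset-scene variant described above amounts to a nearest-scene lookup. The sketch below is a toy illustration under assumed names: the scene library, the three-component feature vectors and the cosine-similarity score are all hypothetical, since the patent does not specify the recognition features:

```python
import math

# Hypothetical preset mapping: scene feature vector -> matching virtual object
SCENE_LIBRARY = {
    "living_room": ([0.9, 0.1, 0.3], "virtual_decoration"),
    "office":      ([0.2, 0.8, 0.5], "virtual_whiteboard"),
    "park":        ([0.1, 0.3, 0.9], "virtual_rainbow"),
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matching_virtual_object(scene_features):
    """Return the virtual object of the preset scene whose features are
    most similar to those of the current real scene picture."""
    best_obj, best_score = None, -1.0
    for features, obj in SCENE_LIBRARY.values():
        score = cosine_similarity(scene_features, features)
        if score > best_score:
            best_obj, best_score = obj, score
    return best_obj

# Features extracted from the current real scene picture (hypothetical values)
matched = find_matching_virtual_object([0.85, 0.15, 0.25])
```

The second variant (a virtual object library searched per target object) would replace the scene key with a per-object key but follow the same highest-similarity pattern.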
S103, generating an illumination effect of the virtual object according to the first illumination information, the position of the virtual object in the picture of the real scene and/or the current view angle of the user.
This step fuses the illumination effects of the virtual object and the real scene, that is, it makes the illumination information reflected by the virtual object consistent with that reflected by the real scene. Accordingly, the illumination effect of the virtual object may be generated from the virtual object's reflected illumination information. Reflection divides into diffuse and specular reflection according to the roughness of the virtual object's surface: when the roughness is greater than a certain threshold, the illumination effect of the virtual object is generated from illumination information determined by diffuse reflection; when the roughness is less than or equal to the threshold, it is generated from illumination information determined by specular reflection.
For the specular reflection, an illumination effect of the virtual object may be generated according to the current viewing angle of the user determined in step S101, the first illumination information, and the position of the virtual object in the real scene picture determined in step S102.
Specifically, the incident light direction is determined according to the user's current viewing angle and the position of the virtual object in the real scene picture. The user's current viewing direction can be taken as the outgoing direction of the reflected light; the normal direction at any point on the surface of the virtual object is obtained, and the incident direction of the light source is calculated according to the law of reflection. Alternatively, the normal directions of several points on the surface of the virtual object can be obtained, and the incident directions computed via the law of reflection averaged to give the mean incident direction of the light source.
Second illumination information reflected by the virtual object into the user's eyes is calculated from the incident light direction and the first illumination information, and the illumination effect of the virtual object is generated from the second illumination information. Because specular reflection can make the virtual object glare, the virtual object may not appear clearly to the user. Therefore, with the first illumination information known, the second illumination information reflected by the virtual object's surface can be calculated, according to the surface roughness, for incident light within a certain range around the incident direction. Widening that range yields a clearer illumination effect: the smoother the surface, the stronger the specular reflection, and if the second illumination information were obtained only from the single incident direction determined above, the outgoing light direction would be unique and the illumination effect would appear blurred by the specular reflection. Selecting incident light within a range around the incident direction gives multiple outgoing directions and increases the clarity of the illumination effect. The second illumination information includes the intensity and direction of the outgoing light, from which the illumination effect of the virtual object is generated.
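The law-of-reflection step above can be sketched as follows: given the unit direction from a surface point toward the user's eye and the surface normal, the incident light direction is the mirror image of the view direction about the normal. The vectors are illustrative placeholders; the patent gives no explicit formulas:

```python
import math

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def incident_from_view(view_dir, normal):
    """Mirror the direction toward the eye about the surface normal
    (law of reflection) to recover the light's incident direction."""
    v, n = normalize(view_dir), normalize(normal)
    d = sum(ni * vi for ni, vi in zip(n, v))
    return tuple(2.0 * d * ni - vi for ni, vi in zip(n, v))

# Surface facing +z, eye 45 degrees off toward +x:
# the light must come from 45 degrees off toward -x.
light_dir = incident_from_view((1.0, 0.0, 1.0), (0.0, 0.0, 1.0))
```

Averaging this result over the normals of several surface points gives the mean incident direction mentioned above.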
In general, the surface of a virtual object is not perfectly smooth, so diffuse reflection is the common case. For diffuse reflection, the illumination effect of the virtual object may be generated according to the first illumination information determined in step S101 and the position of the virtual object in the real scene picture determined in step S102.
Specifically, diffuse reflection information of the real scene picture is generated according to the first illumination information. As an optional implementation of this embodiment, the normal direction of the real scene picture at each of a number of preset positions is determined first; these positions may be preset according to the distribution of objects in the real scene picture. Then, according to the first illumination information, diffuse reflection illumination information of pixels in the real scene picture is calculated in the normal direction, where the pixels include, but are not limited to, pixels corresponding to a light source, pixels corresponding to reflective objects in the real scene picture, and the like.
The illumination effect of the virtual object is then generated according to the position of the virtual object in the real scene picture and the diffuse reflection information. First, the normal direction of the virtual object is determined according to its position in the real scene picture. Next, the diffuse reflection illumination information of the virtual object is acquired from the diffuse reflection information of the real scene picture according to the normal direction of the virtual object: a preset position whose normal direction is the same as or close to that of the virtual object is searched for in the diffuse reflection information of the real scene picture, and the diffuse reflection illumination information of the virtual object is obtained from the diffuse reflection illumination information in that normal direction. Finally, the illumination effect of the virtual object is generated according to the diffuse reflection information of the virtual object.
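The nearest-normal lookup described above can be sketched as follows. The entry format (a list of dicts with unit `"normal"` vectors and scalar `"diffuse"` values, one per preset position) and the cosine-similarity criterion are assumptions for illustration only.

```python
import numpy as np

def object_diffuse_from_scene(obj_normal, scene_info):
    """Find the preset-position entry whose normal direction is the same as or
    closest to the virtual object's normal (largest cosine similarity) and
    reuse its diffuse illumination for the virtual object."""
    best = max(scene_info,
               key=lambda e: float(np.dot(e["normal"], obj_normal)))
    return best["diffuse"]
```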
After step S103, the virtual object with the generated illumination effect may further be composited into the real scene and projected into the eyes of the user through the electronic device, so that the user experiences the augmented reality effect.
In the embodiment of the present invention, a real scene picture corresponding to the current visual angle of a user and first illumination information of the real scene picture are acquired; a virtual object matching the real scene picture is searched for; and the illumination effect of the virtual object is generated according to the first illumination information, the position of the virtual object in the real scene picture, and/or the current visual angle of the user. The embodiment of the invention thus fuses the illumination effect of the virtual object with the real scene, achieves seamless synthesis of the real scene and the virtual object, and gives the user a stronger sense of reality.
Fig. 2 is a flowchart of a method for generating reality illumination in an augmented reality scene according to an embodiment of the present invention. As shown in fig. 2, this embodiment is a specific implementation scheme of the embodiment shown in fig. 1, and therefore details of specific implementation methods and beneficial effects of each step in the embodiment shown in fig. 1 are not described again, and the method for generating real illumination in an augmented reality scene provided in the embodiment of the present invention specifically includes:
s201, acquiring a real scene picture corresponding to a current visual angle of a user and first illumination information of the real scene picture.
S202, searching a virtual object matched with the real scene picture.
S203, judging whether the roughness of the surface of the virtual object is larger than a preset threshold value.
If the roughness of the surface of the virtual object is less than or equal to the preset threshold, steps S204 to S206 are executed; if the roughness of the surface of the virtual object is greater than the preset threshold, steps S207 and S208 are executed.
S204, determining the direction of incident light according to the current visual angle of the user and the position of the virtual object in the picture of the real scene.
S205, calculating second illumination information reflected to the eyes of the user by the virtual object according to the incident light direction and the first illumination information.
S206, generating the illumination effect of the virtual object according to the second illumination information.
And S207, generating diffuse reflection information of the real scene picture according to the first illumination information.
S208, generating an illumination effect of the virtual object according to the position of the virtual object on the picture of the real scene and the diffuse reflection information.
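The branch in steps S203 to S208 amounts to a small dispatch on surface roughness. This is only an illustrative sketch with invented names; the patent does not prescribe an API.

```python
def generate_object_lighting(roughness, threshold, specular_path, diffuse_path):
    """Steps S203-S208 in miniature: a smooth surface (roughness at or below
    the preset threshold) takes the specular path (S204-S206); a rough surface
    takes the diffuse path (S207-S208)."""
    if roughness <= threshold:
        return specular_path()   # S204-S206: incident direction + reflection
    return diffuse_path()        # S207-S208: scene diffuse map + normal lookup
```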
In the embodiment of the present invention, a real scene picture corresponding to the current visual angle of a user and first illumination information of the real scene picture are acquired; a virtual object matching the real scene picture is searched for; and the illumination effect of the virtual object is generated according to the first illumination information, the position of the virtual object in the real scene picture, and/or the current visual angle of the user. The embodiment of the invention thus fuses the illumination effect of the virtual object with the real scene, achieves seamless synthesis of the real scene and the virtual object, and gives the user a stronger sense of reality.
Fig. 3 is a structural diagram of a device for generating real illumination in an augmented reality scene according to an embodiment of the present invention. As shown in fig. 3, the apparatus specifically includes: an obtaining module 1000, a searching module 2000, and a generating module 3000.
The acquiring module 1000 is configured to acquire a real scene picture corresponding to a current viewing angle of a user and first illumination information of the real scene picture; the searching module 2000 is configured to search for a virtual object matching the real scene picture; the generating module 3000 is configured to generate an illumination effect of the virtual object according to the first illumination information, the position of the virtual object in the real scene picture, and/or the current view angle of the user.
Optionally, as shown in fig. 4, the generating module 3000 includes a determining unit 3100, a calculating unit 3200, and a first generating unit 3300.
The determining unit 3100, configured to determine a direction of incident light according to the current viewing angle of the user and the position of the virtual object in the real scene picture; the calculation unit 3200 is configured to calculate second illumination information, which is reflected by the virtual object to the eyes of the user, according to the incident light direction and the first illumination information; the first generating unit 3300 is configured to generate an illumination effect of the virtual object according to the second illumination information.
The device for generating reality illumination in an augmented reality scene provided by the embodiment of the present invention is specifically configured to execute the method provided by the embodiment shown in fig. 1 and/or fig. 2, and the implementation principle, method, and functional use thereof are similar to those of the embodiment shown in fig. 1 and/or fig. 2, and are not described herein again.
Fig. 5 is a structural diagram of a device for generating real illumination in an augmented reality scene according to an embodiment of the present invention. As shown in fig. 5, the apparatus specifically includes: an obtaining module 1000, a searching module 2000, and a generating module 3000.
The acquiring module 1000 is configured to acquire a real scene picture corresponding to a current viewing angle of a user and first illumination information of the real scene picture; the searching module 2000 is configured to search for a virtual object matching the real scene picture; the generating module 3000 is configured to generate an illumination effect of the virtual object according to the first illumination information, the position of the virtual object in the real scene picture, and/or the current view angle of the user.
Optionally, as shown in fig. 5, the generating module 3000 includes a second generating unit 3400 and a third generating unit 3500.
The second generating unit 3400 is configured to generate diffuse reflection information of the real scene picture according to the first illumination information; the third generating unit 3500, configured to generate an illumination effect of the virtual object according to the position of the virtual object in the real scene picture and the diffuse reflection information.
Optionally, the second generating unit 3400 is specifically configured to determine the normal direction of the real scene picture at a preset position, and to calculate diffuse reflection illumination information of the pixels in the real scene picture in the normal direction according to the first illumination information.
Optionally, the third generating unit 3500 is specifically configured to: determine the normal direction of the virtual object according to the position of the virtual object in the real scene picture; acquire diffuse reflection illumination information of the virtual object from the diffuse reflection information of the real scene according to the normal direction of the virtual object; and generate the illumination effect of the virtual object according to the diffuse reflection information of the virtual object.
The device for generating reality illumination in an augmented reality scene provided by the embodiment of the present invention is specifically configured to execute the method provided by the embodiment shown in fig. 1 and/or fig. 2, and its implementation principle, method, and functional use are similar to those of the embodiment shown in fig. 1 and/or fig. 2 and are not described herein again.
The device for generating the real illumination in the augmented reality scene according to the embodiments of the present invention may be independently disposed in the electronic device as one of the software or hardware functional units, or may be integrated in the processor as one of the functional modules, to execute the method for generating the real illumination in the augmented reality scene according to the embodiments of the present invention.
Fig. 6 is a schematic diagram of the hardware structure of an electronic device for executing the method for generating real illumination in an augmented reality scene according to an embodiment of the present invention. As shown in fig. 6, the electronic device includes:
one or more processors 6100 and a memory 6200, in fig. 6, one processor 6100 is taken as an example.
The apparatus for executing the method for generating real illumination in an augmented reality scene may further include: an input device 6300 and an output device 6400.
The processor 6100, the memory 6200, the input device 6300, and the output device 6400 may be connected by a bus or other means, and fig. 6 illustrates examples of connection by a bus.
The memory 6200, as a non-volatile computer-readable storage medium, may be used to store a non-volatile software program, a non-volatile computer-executable program, and modules, such as program instructions/modules corresponding to the method for generating real illumination in the augmented reality scene in the embodiment of the present invention. The processor 6100 executes various functional applications and data processing of the server by running the nonvolatile software program, instructions, and modules stored in the memory 6200, so as to implement the method for generating the real illumination in the augmented reality scene.
The memory 6200 may include a storage program area and a storage data area, where the storage program area may store an operating system and an application program required by at least one function, and the storage data area may store data created by use of the device for generating real illumination in an augmented reality scene provided according to the embodiment of the present invention, and the like. In addition, the memory 6200 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, flash memory device, or other nonvolatile solid-state storage device. In some embodiments, the memory 6200 may optionally include memory located remotely from the processor 6100; these remote memories may be connected over a network to the device for generating real illumination in the augmented reality scene. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 6300 may receive input numeric or character information and generate key signal inputs related to user settings and function controls of the real illumination device generated in the augmented reality scene. The input device 6300 may include a pressing module and the like.
The one or more modules are stored in the memory 6200, which when executed by the one or more processors 6100, perform a method of generating real illumination in the augmented reality scene.
The electronic device of embodiments of the present invention exists in a variety of forms, including but not limited to:
(1) Mobile communication devices, which are characterized by mobile communication capability and are primarily aimed at providing voice and data communication. Such terminals include smart phones (e.g., iPhones), multimedia phones, feature phones, and low-end phones.
(2) Ultra-mobile personal computer devices, which belong to the category of personal computers, have computing and processing functions, and generally also provide mobile internet access. Such terminals include PDA, MID, and UMPC devices, such as iPads.
(3) Portable entertainment devices, which can display and play multimedia content. Such devices include audio and video players (e.g., iPods), handheld game consoles, electronic books, smart toys, and portable car navigation devices.
(4) The server is similar to a general computer architecture, but has higher requirements on processing capability, stability, reliability, safety, expandability, manageability and the like because of the need of providing highly reliable services.
(5) And other electronic devices with data interaction functions.
The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The embodiment of the present invention provides a non-transitory computer-readable storage medium, where the computer-readable storage medium stores computer-executable instructions, and when the computer-executable instructions are executed by an electronic device, the electronic device is caused to execute a method for generating real illumination in an augmented reality scene in any method embodiment described above.
Embodiments of the present invention provide a computer program product, where the computer program product includes a computer program stored on a non-transitory computer readable storage medium, where the computer program includes program instructions, where the program instructions, when executed by an electronic device, cause the electronic device to perform the method for generating real illumination in an augmented reality scene in any of the above-mentioned method embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly also by hardware. With this understanding, the above technical solutions, or the portions thereof contributing to the prior art, may be embodied in the form of a software product stored on a computer-readable storage medium, which includes any mechanism for storing or transmitting information in a form readable by a computer. For example, a machine-readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory storage media, and electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). The computer software product includes instructions for causing a computing device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the various embodiments or in portions of the embodiments.
In another embodiment, fig. 7 provides an AR helmet as an implementation device of the above rendering method for the light and shadow effect of a virtual object. The AR helmet includes a clamping portion 1, a lens portion 2 and a head-mounted portion 3. The clamping portion 1 includes a base 101, a base plate 102 and an inner frame 103; the base plate 102 and the inner frame 103 are both vertically mounted on the base 101, the base plate 102 being a plate-shaped structure and the inner frame 103 a frame structure adapted to the lens portion. The base plate 102 and the inner frame 103 are located at the front and rear of the base 101, that is, the inner frame 103 is on the side close to the lens portion 2 and the base plate 102 on the side far from the lens portion 2, and an electronic device such as a mobile phone is mounted between the base plate 102 and the inner frame 103.
Another improvement of this embodiment is shown in conjunction with fig. 8 and 9: a clamping device 4 for clamping the mobile phone is arranged on the base plate 102. The clamping device 4 comprises a mounting hole 401, a mounting cover 402, a first bolt 403, a guide sleeve 404, a guide pin 405 and other structures. The mounting hole 401 has a first end far away from the inner frame 103 and a second end close to the inner frame; specifically, the mounting hole 401 comprises a first section and a second section which are adjacent, the inner diameter of the first section being smaller than that of the second section. The mounting cover 402 is mounted on the outer end of the second section, an adjusting ring 407 is mounted at the end of the second section close to the first section, and a limiting flange 408, which cooperates with the adjusting ring 407 and limits the moving stroke of the guide sleeve, is arranged at the inner end of the guide sleeve 404.
The first end is provided with the mounting cover 402, the mounting cover 402 is provided with a shaft hole 4021, and the first bolt 403 is mounted on the mounting cover 402 through the shaft hole 4021. The outer end of the first bolt 403 is connected with a first screwing piece 406, and the inner end of the first bolt 403 is in threaded connection with the inner end of the guide sleeve 404 mounted in the mounting hole 401. The outer end of the guide sleeve 404 is provided with a pressing end 4041 for pressing the mobile phone; the outer wall of the guide sleeve 404 is provided with a groove (not shown) that cooperates with the guide pin 405 in the horizontal direction, one end of the guide pin 405 being mounted on the inner wall of the mounting hole 401 and the other end being mounted in the groove. When a user rotates the first screwing piece 406, the first bolt 403 is driven to rotate, which drives the guide sleeve 404 forward or backward; because of the guide pin, the guide sleeve can only translate, so the pressing end 4041 presses against the mobile phone and the inner frame 103. This process allows slow advance of the pressing end with adjustable pressing force, avoiding damage to the rear shell of the mobile phone. Fixing the mobile phone through the point contact of the pressing end is superior to the clamping-plate or face-shell fixing of the prior art, does not affect the heat dissipation of the mobile phone, and adapts to mobile phones of various screen sizes and thicknesses.
The applicant finds that some mobile phones do not provide functions for switching the playing program or adjusting the sound in an AR scene, so most users must take the mobile phone out of the clamping mechanism to switch playback or adjust sound and picture. The applicant therefore designs the clamping portion 1 and the lens portion 2 to be in sliding fit: the lens portion 2 is provided with a mounting plate 201, the clamping portion 1 is mounted on the mounting plate 201, and a plurality of rollers 2011 are arranged on the mounting plate 201 at uniform intervals along its width direction. With the clamping portion and the lens portion in sliding fit, the mobile phone can be slid out when it needs to be operated and the clamping portion pushed back to its original position for viewing afterwards, which is convenient and fast.
Referring to fig. 9, in this embodiment a locking structure 104 capable of locking the guide sleeve and the rollers is further disposed on the clamping portion 1; the locking structure 104 not only prevents the first bolt from resetting but also locks the sliding fit between the clamping portion and the lens portion 2. Specifically, the locking structure 104 of this embodiment includes a return spring 1041, and a sleeve 1042 and a screw sleeve 1043 which are bilaterally symmetric about the guide sleeve 404 and disposed below it. The upper part of the inner ends of the sleeve 1042 and the screw sleeve 1043 has a first locking portion 1044 matched in size with the outer wall of the lower part of the guide sleeve, and the lower part of their inner ends has a second locking portion 1045 matched in size with the rollers 2011. The inner end of the sleeve 1042 is provided with a first spring slot 1046 and the inner end of the screw sleeve 1043 with a second spring slot 1047; one end of the return spring 1041 is mounted in the first spring slot 1046 and the other end in the second spring slot 1047. A second bolt 1048 is mounted in the sleeve 1042 and the screw sleeve 1043, the sleeve 1042 and the screw sleeve 1043 are connected by the second bolt 1048 and a matching locking nut 1049, and at least one end of the second bolt 1048 is provided with a second screwing piece. The locking structure 104 can both fix the guide sleeve 404 and lock the sliding fit of the clamping portion 1 and the lens portion 2, achieving multiple functions with one simplified structure.
In addition, the applicant also finds that most existing AR helmets either have no heat dissipation structure for the mobile phone, or dissipate heat through complex temperature sensors, controllers and similar structures, which are complicated, costly, greatly increase the size of the AR helmet, and prevent a lightweight design. The applicant therefore improves on the above basis. Referring to fig. 10, in this embodiment a plurality of support bars 5 parallel to the mobile phone rear case extend from the pressing end 4041; the end of each support bar 5 is provided with a support point 501 connected to the mobile phone rear case, and a micro fan 6 is mounted on the support bar 5. The micro fan 6 is provided with a touch switch (not shown in the figure); at least one through hole 502 is provided on the support bar 5, and a driving member 503 made of shape memory alloy is installed in the through hole 502, one end of the driving member 503 being connected to the touch switch and the other end abutting against the mobile phone rear case. The driving member 503 is in a martensite state when the temperature of the mobile phone rear case reaches an early warning value, turning on the micro fan 6 through the touch switch, and is in an austenite state when the temperature is lower than the early warning value, turning off the micro fan. Switching the micro fan on and off through the shape change of the shape memory alloy with temperature gives higher precision, helps cool the mobile phone and avoids wear on it, requires no control structure, simplifies the cooling structure, and reduces production cost and installation space.
In addition, a groove matched with the first screwing piece may be provided on the base plate 102, with the first screwing piece 406 located in the groove. Arranging the screwing piece in the groove allows the outer surface of the base plate to remain a flat structure, simplifying the appearance.
The intelligent mobile phone acquires a real scene picture corresponding to a current visual angle of a user and first illumination information of the real scene picture through a camera device carried by the intelligent mobile phone, searches for a virtual object matched with the real scene picture, and generates an illumination effect of the virtual object according to the first illumination information, the position of the virtual object on the real scene picture and/or the current visual angle of the user.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the embodiments of the present invention, and not to limit the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (7)

1. A method of generating real illumination in an augmented reality scene, comprising:
acquiring a real scene picture corresponding to a current visual angle of a user and first illumination information of the real scene picture, wherein the first illumination information comprises the intensity of a light source and the position of the light source;
searching a virtual object matched with the picture of the real scene;
determining the direction of incident light according to the current visual angle of the user and the position of the virtual object in the picture of the real scene;
calculating second illumination information reflected to the eyes of the user by the virtual object according to the incident light direction and the first illumination information;
generating an illumination effect of the virtual object according to the second illumination information;
the method is applied to an AR helmet comprising a grip portion, a lens portion and a head-mount portion,
the clamping part comprises a base, a base plate and an inner frame, the base plate and the inner frame are both arranged on the base, the inner frame is arranged on one side close to the lens part, the base plate is arranged on one side far away from the lens part, a clamping device is arranged on the base plate and comprises an installation hole, an installation cover, a first bolt, a guide sleeve and a guide pin, the installation cover, the first bolt, the guide sleeve and the guide pin are arranged in the installation hole, the installation hole comprises a first section and a second section which are adjacent, the inner diameter of the first section is smaller than that of the second section, the installation cover is arranged on the outer end of the second section, the end part of the second section close to the first section is provided with an adjusting ring, the inner end of the guide sleeve is provided with a limit flange which is matched with the adjusting ring and limits the moving stroke of the guide sleeve, and the installation cover is provided with a shaft hole, the first bolt is installed on the installation cover through the shaft hole, the outer end part of the first bolt is connected with a first screwing piece, the inner end part of the first bolt is in threaded connection with the inner end part of a guide sleeve installed in the installation hole, the outer end part of the guide sleeve is provided with a pressing end for pressing a mobile phone, the outer wall of the guide sleeve is provided with a groove matched with the guide pin along the horizontal direction, one end of the guide pin is installed on the inner wall of the installation hole, and the other end of the guide pin is installed in the groove;
the mobile phone obtains a real scene picture corresponding to a current visual angle of a user and first illumination information of the real scene picture through a camera device carried by the mobile phone, searches for a virtual object matched with the real scene picture, and generates an illumination effect of the virtual object according to the first illumination information, the position of the virtual object on the real scene picture and/or the current visual angle of the user.
2. The method according to claim 1, wherein the generating the lighting effect of the virtual object according to the first lighting information and the position of the virtual object in the picture of the real scene comprises:
generating diffuse reflection information of the real scene picture according to the first illumination information;
and generating the illumination effect of the virtual object according to the position of the virtual object in the picture of the real scene and the diffuse reflection information.
3. The method of claim 2, wherein generating diffuse reflectance information of the real scene from the first lighting information comprises:
determining the normal direction of the real scene picture at a preset position;
and calculating diffuse reflection illumination information of the pixels in the real scene picture in the normal direction according to the first illumination information.
4. The method according to claim 2, wherein the generating the lighting effect of the virtual object according to the position of the virtual object on the real scene picture and the diffuse reflection information comprises:
determining the normal direction of the virtual object according to the position of the virtual object in the picture of the real scene;
acquiring diffuse reflection illumination information of the virtual object from the diffuse reflection information of the real scene according to the normal direction of the virtual object;
and generating the illumination effect of the virtual object according to the diffuse reflection information of the virtual object.
5. The method of claim 1, wherein the holder of the AR helmet is slidably engaged with the lens, the lens being provided with a mounting plate, the holder being mounted on the mounting plate, the mounting plate being provided with a plurality of rollers at uniform intervals along a width thereof, the holder having a locking structure for locking the guide sleeve and the rollers.
6. The method of claim 5, wherein the locking structure of the AR helmet comprises a return spring and a sleeve and a threaded sleeve that are bilaterally symmetric about and disposed below a guide sleeve, the upper parts of the inner ends of the sleeve and the threaded sleeve are provided with first locking parts matched with the outer wall of the lower part of the guide sleeve in size, the lower parts of the inner ends of the sleeve and the thread sleeve are provided with second locking parts matched with the size of the roller, the inner end of the sleeve is provided with a first spring groove, the inner end of the threaded sleeve is provided with a second spring groove, one end of the return spring is arranged in the first spring groove, the other end of the return spring is arranged in the second spring groove, the sleeve and the threaded sleeve are internally provided with a second bolt, the sleeve and the threaded sleeve are connected through the second bolt and a locking nut matched with the second bolt, and at least one end part of the second bolt is provided with a second screwing piece.
7. The method as claimed in claim 1, wherein a plurality of support bars extend from the pressing end of the AR helmet, the end of each support bar being provided with a support point connected to the rear shell of the mobile phone; a micro fan provided with a touch switch is mounted on the support bar; the support bar is provided with at least one through hole in which a driving member made of shape memory alloy is installed, one end of the driving member being connected to the touch switch and the other end abutting against the rear shell of the mobile phone; when the temperature of the rear shell of the mobile phone reaches an early-warning value, the driving member is in a martensite state and turns on the micro fan through the touch switch, and when the temperature of the rear shell is lower than the early-warning value, the driving member is in an austenite state;
the base plate is provided with a groove matched with the first screwing piece, and the first screwing piece is located in the groove.
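The thermal switching behavior of the shape-memory-alloy driving member in claim 7 reduces to a simple threshold rule, which can be modeled as below. The state labels follow the claim text as translated; the function name and the 45 °C early-warning value are illustrative assumptions, not values from the patent:

```python
def fan_state(back_shell_temp_c, warning_temp_c=45.0):
    """Model the shape-memory-alloy driving member of the claim.

    Per the claim text: at or above the early-warning temperature the
    alloy is in its martensite state, pressing the touch switch and
    turning the micro fan on; below that temperature it is in its
    austenite state and the fan is off.

    Returns (alloy_state, fan_on).
    """
    if back_shell_temp_c >= warning_temp_c:
        return ("martensite", True)   # switch pressed, fan running
    return ("austenite", False)       # switch released, fan off
```

This is a sketch of the control logic only; the actual mechanism is mechanical, with the alloy's phase change itself actuating the switch.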
CN201711072544.9A 2017-11-01 2017-11-01 Method and device for generating real illumination in augmented reality scene Active CN107749076B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711072544.9A CN107749076B (en) 2017-11-01 2017-11-01 Method and device for generating real illumination in augmented reality scene

Publications (2)

Publication Number Publication Date
CN107749076A CN107749076A (en) 2018-03-02
CN107749076B true CN107749076B (en) 2021-04-20

Family

ID=61252931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711072544.9A Active CN107749076B (en) 2017-11-01 2017-11-01 Method and device for generating real illumination in augmented reality scene

Country Status (1)

Country Link
CN (1) CN107749076B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110392251B (en) * 2018-04-18 2021-07-16 广景视睿科技(深圳)有限公司 Dynamic projection method and system based on virtual reality
CN108614638B (en) * 2018-04-23 2020-07-07 太平洋未来科技(深圳)有限公司 AR imaging method and apparatus
CN108898675A (en) * 2018-06-06 2018-11-27 微幻科技(北京)有限公司 A kind of method and device for adding 3D virtual objects in virtual scene
WO2020019132A1 (en) * 2018-07-23 2020-01-30 太平洋未来科技(深圳)有限公司 Method and apparatus for rendering virtual object on the basis of light information, and electronic device
WO2020029178A1 (en) * 2018-08-09 2020-02-13 太平洋未来科技(深圳)有限公司 Light and shadow rendering method and device for virtual object in panoramic video, and electronic apparatus
WO2020056689A1 (en) * 2018-09-20 2020-03-26 太平洋未来科技(深圳)有限公司 Ar imaging method and apparatus and electronic device
CN109445598B (en) * 2018-11-07 2022-04-15 深圳珑璟光电技术有限公司 Augmented reality system device based on vision
CN111639613B (en) * 2020-06-04 2024-04-16 上海商汤智能科技有限公司 Augmented reality AR special effect generation method and device and electronic equipment
IT202000020302A1 (en) 2020-08-24 2022-02-24 Youbiquo SYSTEM AND METHOD FOR MEASURING THE DIRECTION AND PREVALENT COLOR OF LIGHT INCIDENT ON A PERSON
CN113552942A (en) * 2021-07-14 2021-10-26 海信视像科技股份有限公司 Method and equipment for displaying virtual object based on illumination intensity

Citations (2)

Publication number Priority date Publication date Assignee Title
CN103995700A (en) * 2014-05-14 2014-08-20 无锡梵天信息技术股份有限公司 Method for achieving global illumination of 3D game engine
CN106950693A (en) * 2017-03-24 2017-07-14 厦门轻游信息科技有限公司 A kind of the interaction helmet and its application based on MR mixed reality technologies

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
CN100594519C (en) * 2008-03-03 2010-03-17 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera
CN101520904B (en) * 2009-03-24 2011-12-28 上海水晶石信息技术有限公司 Reality augmenting method with real environment estimation and reality augmenting system
CN101710429B (en) * 2009-10-12 2012-09-05 湖南大学 Illumination algorithm of augmented reality system based on dynamic light map
CN106575450B (en) * 2014-05-13 2019-07-26 河谷控股Ip有限责任公司 It is rendered by the augmented reality content of albedo model, system and method
US9551873B2 (en) * 2014-05-30 2017-01-24 Sony Interactive Entertainment America Llc Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content
CN105844714A (en) * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system
CN206283563U (en) * 2016-10-25 2017-06-27 马建东 A kind of mobile phone clamping type VR shows
CN106940897A (en) * 2017-03-02 2017-07-11 苏州蜗牛数字科技股份有限公司 A kind of method that real shadow is intervened in AR scenes
CN107092355B (en) * 2017-04-07 2023-09-22 北京小鸟看看科技有限公司 Method, device and system for controlling content output position of mobile terminal in VR (virtual reality) headset

Also Published As

Publication number Publication date
CN107749076A (en) 2018-03-02

Similar Documents

Publication Publication Date Title
CN107749076B (en) Method and device for generating real illumination in augmented reality scene
CN107845132B (en) Rendering method and device for color effect of virtual object
CN107871339B (en) Rendering method and device for color effect of virtual object in video
CN107705353B (en) Rendering method and device for virtual object shadow effect applied to augmented reality
CN107749075B (en) Method and device for generating shadow effect of virtual object in video
Chen et al. An overview of augmented reality technology
CN102999160B (en) The disappearance of the real-world object that user controls in mixed reality display
KR102005106B1 (en) System and method for augmented and virtual reality
US10559121B1 (en) Infrared reflectivity determinations for augmented reality rendering
US8768141B2 (en) Video camera band and system
US9734633B2 (en) Virtual environment generating system
CN105163268B (en) Glasses type communication device, system and method
EP2410733A2 (en) Camera system and method of displaying photos
KR20230044041A (en) System and method for augmented and virtual reality
KR20150090183A (en) System and method for generating 3-d plenoptic video images
CN108377398A (en) Based on infrared AR imaging methods, system and electronic equipment
CN107728787B (en) Information display method and device in panoramic video
CN106843473B (en) AR-based children painting system and method
CN114092671A (en) Virtual live broadcast scene processing method and device, storage medium and electronic equipment
KR102140077B1 (en) Master device, slave device and control method thereof
CN111918114A (en) Image display method, image display device, display equipment and computer readable storage medium
CN113194329B (en) Live interaction method, device, terminal and storage medium
WO2020250106A1 (en) A system and a method for teleportation for enhanced audio-visual interaction in mixed reality (mr) using a head mounted device (hmd)
US11924541B2 (en) Automatic camera exposures for use with wearable multimedia devices
US11533351B2 (en) Efficient delivery of multi-camera interactive content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant