CN109840948B - Target object throwing method and device based on augmented reality - Google Patents


Info

Publication number
CN109840948B
CN109840948B (application number CN201711226533.1A)
Authority
CN
China
Prior art keywords
target object
image
data
dimensional model
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711226533.1A
Other languages
Chinese (zh)
Other versions
CN109840948A (en)
Inventor
李炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inlife Handnet Co Ltd
Original Assignee
Inlife Handnet Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inlife Handnet Co Ltd filed Critical Inlife Handnet Co Ltd
Priority to CN201711226533.1A priority Critical patent/CN109840948B/en
Publication of CN109840948A publication Critical patent/CN109840948A/en
Application granted granted Critical
Publication of CN109840948B publication Critical patent/CN109840948B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses a target object throwing method and device based on augmented reality. Three-dimensional reconstruction is performed on the target object to obtain its three-dimensional information; the normal direction at any position of the model is then obtained from the three-dimensional data model; and the obtained normal data, together with related data of the image, are used to calculate the illumination direction so that a light source and the target object can be placed at corresponding positions in a scene. A simple and effective method and device for throwing a target object in augmented reality are thereby obtained.

Description

Target object throwing method and device based on augmented reality
Technical Field
The present invention relates to the field of virtual reality technologies, and in particular, to a method and an apparatus for determining an illumination direction of a target object in augmented reality.
Background
AR (Augmented Reality) is a technology that integrates real-world information and virtual-world information in a "seamless" manner. Physical information (visual information, sound, taste, touch, etc.) that would otherwise be difficult to experience within a certain time and space range of the real world is simulated by computer and other technologies; the real environment and virtual objects are then superimposed in real time in the same picture or space so that they coexist, and the virtual information is applied to the real world and perceived by the human senses, achieving a sensory experience that goes beyond reality. Augmented reality technology displays not only real-world information but also virtual information at the same time, and the two kinds of information complement and overlay each other. In visual augmented reality, a user wearing a head-mounted display sees computer graphics multiplexed with the real world around them. AR offers a brand-new mode of visual operation of and interaction with complex data through a computer; compared with the traditional human-computer interface and popular window operation, it represents a qualitative leap in technical thinking.
A problem often faced by augmented reality devices is illumination simulation. Traditional illumination direction estimation methods analyze the illumination direction from the image alone; because the normal direction of an object cannot be obtained from an image, such analysis is difficult. Therefore, there is a need for a method for determining the illumination direction of a target object based on augmented reality.
Disclosure of Invention
The embodiment of the invention provides a method and a device for determining the illumination direction of a target object in augmented reality, which can simply and effectively determine the illumination direction of the target object in augmented reality.
According to one aspect of the invention, the invention provides a target object throwing method based on augmented reality, which comprises the following steps:
s101, obtaining a three-dimensional model of the target object, and obtaining related data of the three-dimensional model according to the three-dimensional model;
s102, acquiring an image, and acquiring related data of the image according to the image and the related data of the three-dimensional model;
s103, determining the illumination direction of the target object through the related data of the three-dimensional model and the related data of the image;
s104, putting a light source at a corresponding position in the scene according to the illumination direction of the target object, and putting the target object in the scene according to the related data of the image.
The invention provides a target object throwing method and device based on augmented reality. Three-dimensional reconstruction is performed on the target object to obtain its three-dimensional information; the normal direction at any position of the model is then obtained from the three-dimensional data model; and the obtained normal data and related data of the image are used to calculate the illumination direction so as to place a light source and the target object at corresponding positions in a scene. A simple and effective augmented reality target object throwing method and device are thereby obtained.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a method for delivering a target object based on augmented reality according to an embodiment of the present invention.
Fig. 2 is an application scene schematic diagram of a launching device based on a target object in augmented reality according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a launching device based on an object in augmented reality according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
The terms first, second, third and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the objects so described may be interchanged where appropriate. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion.
In this patent document, the drawings discussed below and the various embodiments used to describe the principles of the present disclosure are by way of illustration only and should not be construed to limit the scope of the present disclosure. Those skilled in the art will understand that the principles of the present invention may be implemented in any suitably arranged device. Exemplary embodiments will be described in detail, examples of which are illustrated in the accompanying drawings. Further, a terminal according to an exemplary embodiment will be described in detail with reference to the accompanying drawings. Like reference symbols in the drawings indicate like elements.
The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The use of expressions in the singular encompasses plural forms of expressions unless the context clearly dictates otherwise. In the present description, it should be understood that terms such as "comprising," "having," "including," and "containing" are intended to specify the presence of the stated features, integers, steps, actions, or combinations thereof disclosed in the present description, but are not intended to preclude the presence or addition of one or more other features, integers, steps, actions, or combinations thereof. Like reference numerals in the drawings refer to like parts.
The embodiment of the invention provides a method and a device for throwing a target object based on augmented reality. Each of which will be described in detail below.
In a preferred embodiment, a method and an apparatus for delivering a target object based on augmented reality are provided, as shown in fig. 1, the flow may be as follows:
s101, obtaining a three-dimensional model of the target object, and obtaining relevant data of the three-dimensional model according to the three-dimensional model.
Specifically, a three-dimensional scanner, binocular stereo vision, or the Autodesk 123D technology can be adopted to obtain a three-dimensional model of the target object; the last of these acquires the three-dimensional model of the target object by three-dimensional reconstruction from pictures of the target object.
For example, when the Autodesk 123D technology is adopted, three-dimensional reconstruction of the target object can be achieved by taking a plurality of pictures of it.
After the three-dimensional model of the target object is obtained, the normals of the surface points of the target object are acquired, and the first derivative of each normal is calculated.
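To make this step concrete, the per-vertex normals of a reconstructed triangle mesh can be estimated by accumulating face normals at each vertex. The NumPy sketch below is an illustration on hypothetical mesh data, not the patent's own implementation:

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Estimate per-vertex unit normals of a triangle mesh by
    accumulating (area-weighted) face normals at each vertex."""
    normals = np.zeros_like(vertices, dtype=float)
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    # The cross product of two edges gives a face normal scaled by twice the face area.
    face_n = np.cross(v1 - v0, v2 - v0)
    for i in range(3):
        np.add.at(normals, faces[:, i], face_n)
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.clip(lengths, 1e-12, None)

# Example: a small tetrahedral mesh (hypothetical data).
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
faces = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
n = vertex_normals(verts, faces)
```

Numerical first derivatives of the normal field could then be taken as finite differences between the normals of neighboring surface points.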
S102, acquiring an image, and acquiring related data of the image according to the image and the related data of the three-dimensional model.
In an embodiment of the present invention, the step of acquiring an image and related data of the image includes:
acquiring an image through a camera;
obtaining the pose relationship between the camera and the target object according to the mark information in the image;
and projecting according to the pose relation between the camera and the target object and the related data of the three-dimensional model of the target object, and obtaining the coordinates of the image corresponding to the three-dimensional model.
This step determines the specific coordinates at which the target object will be projected in the real scene, and provides the basis for placing the target object in the real scene in the next step.
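The projection described in this step can be illustrated with a standard pinhole camera model, in which model points are transformed by the camera pose and then mapped to pixel coordinates. The rotation R, translation t, and intrinsic matrix K below are hypothetical values, not taken from the patent:

```python
import numpy as np

def project_points(points_3d, R, t, K):
    """Project 3D model points into the image with a pinhole camera:
    camera coords = R X + t, pixel coords = K * camera coords / depth."""
    cam = points_3d @ R.T + t          # world -> camera coordinates
    uvw = cam @ K.T                    # camera -> homogeneous pixel coordinates
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # hypothetical intrinsics
R = np.eye(3)                          # identity rotation (hypothetical pose)
t = np.array([0.0, 0.0, 2.0])          # object 2 m in front of the camera
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
uv = project_points(pts, R, t, K)
# The model origin lands at the principal point (320, 240).
```

In practice the pose (R, t) would come from the marker information mentioned above, for example via a marker-based pose estimation routine.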
S103, determining the illumination direction of the target object through the related data of the three-dimensional model and the related data of the image.
In an embodiment of the present invention, the step of determining the illumination direction of the target object by the related data of the three-dimensional model and the related data of the image includes:
filtering the image;
obtaining brightness data of the image from the image according to the coordinates of the image;
and calculating the illumination direction of the target object by using the brightness data of the image and the first derivative of the normal line of the target object.
The usual illumination direction estimation method analyzes continuous images, so the amount of computation is huge. To meet the real-time requirement of the augmented reality target object device, the invention adopts a discrete point analysis method, which reduces the amount of computation. Through a number of experiments, it was finally determined that in the embodiment of the invention 40 to 60 sampling points are uniformly selected on the three-dimensional model of the target object; the brightness values of the sampling points can be extracted directly from the image through their image coordinates, and the illumination direction of the target object is then calculated.
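One way to realize this calculation, assuming a Lambertian surface, is to solve a small least-squares system over the sampled normals and brightness values. The sketch below uses about 50 synthetic sampling points (within the 40-60 range described above); it is only an illustration under the Lambertian assumption, not the patent's implementation:

```python
import numpy as np

def estimate_light_direction(normals, intensities):
    """Least-squares light direction under a Lambertian model:
    I_k ~ rho * (n_k . l) for lit samples, so solve N l = I for l."""
    l, *_ = np.linalg.lstsq(normals, intensities, rcond=None)
    return l / np.linalg.norm(l)       # direction only; magnitude absorbs albedo

rng = np.random.default_rng(0)
true_l = np.array([0.0, 0.0, 1.0])     # hypothetical ground-truth light direction
# ~50 sampled unit normals kept on the lit hemisphere (synthetic samples)
n = rng.normal(size=(50, 3))
n /= np.linalg.norm(n, axis=1, keepdims=True)
n[:, 2] = np.abs(n[:, 2])              # keep normals facing the light
I = n @ true_l                         # synthetic Lambertian brightness values
est = estimate_light_direction(n, I)
```

With real image data, the intensities would be the brightness values extracted at the sampling points' image coordinates, and shadowed samples (n . l < 0) would need to be excluded before solving.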
S104, putting a light source at a corresponding position in the scene according to the illumination direction of the target object, and putting the target object in the scene according to the related data of the image.
In the embodiment of the invention, the step of throwing the light source at the corresponding position in the scene according to the illumination direction of the target object and throwing the target object in the scene according to the related data of the image comprises the following steps:
putting a light source at a corresponding position in a scene according to the illumination direction of the target object;
and placing a target object in the scene according to the pose relation of the camera and the target object.
Before the target object is placed in the scene according to the pose relationship between the camera and the target object, whether the illumination direction of the target object is accurate can be checked by detecting the image of the pre-placed target object.
The specific steps of the checking stage are as follows:
acquiring an image of the pre-placed target object;
acquiring brightness data of the pre-placed target object image;
comparing the brightness of the image of the pre-placed target object with that of the image obtained by the camera; if the difference between the brightness values of the two images is within 5%, placing the target object; if the difference between the brightness values of the two images is greater than 5%, returning to step S101.
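The 5% verification threshold above can be sketched as a simple mean-brightness comparison; the patch data and function name below are hypothetical illustrations, not the patent's implementation:

```python
import numpy as np

def brightness_check(pre_render, camera_image, tol=0.05):
    """Accept the pre-placed rendering if its mean brightness differs
    from the camera image's by at most `tol` (5% by default, as in the
    verification stage of step S104)."""
    b_render = float(np.mean(pre_render))
    b_camera = float(np.mean(camera_image))
    return abs(b_render - b_camera) / max(b_camera, 1e-9) <= tol

# Hypothetical grayscale patches covering the target object region.
pre = np.full((4, 4), 100.0)
cam = np.full((4, 4), 103.0)   # about 3% brighter -> within the 5% tolerance
ok = brightness_check(pre, cam)
```

If the check fails, the flow returns to step S101 and the illumination direction is re-estimated.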
As can be seen from the above, the invention provides a target object throwing method based on augmented reality. The method acquires three-dimensional information of the target object by performing three-dimensional reconstruction on it, obtains the normal direction at any position of the model from the three-dimensional data model, and then calculates the illumination direction using the acquired normal data and related data of the image, so as to place a light source and the target object at corresponding positions in a scene. A simple and effective augmented reality target object throwing method and device are thereby obtained.
Referring to fig. 2, a video photographing apparatus based on virtual reality is provided in still another embodiment of the present invention. As shown in the figure, the video photographing apparatus includes: a photographing device 33, a server 34, and a control display device 36.
The photographing device 33 may be a video camera, a still camera, or an electronic device with a photographing function, and may be used to collect image information. For example, the photographing device may be used to capture the limb movements, expressions, language information, and the like of a user.
The server 34 may be a data server, a network server, or other network device. The server 34 may be used to provide three-dimensional images of the prop model 341 and the virtual scene 342.
The control display device 36 may include a computer, a smart phone, a tablet computer, or another device having an arithmetic processing function.
In yet another embodiment, an application scene schematic diagram of a target object throwing device based on augmented reality is provided. The target object throwing method is described in detail below by taking the target object solid model 22 as an example.
In this embodiment, a scanning device 21 (e.g. a three-dimensional scanner or a binocular stereo camera) is used to acquire a three-dimensional model of the target object entity 22, and related data of the three-dimensional model is obtained from the three-dimensional model. An image of the target object 22 and related data of that image are then acquired by a camera 23; the illumination direction of the target object 22 is calculated from the related data of the three-dimensional model and the related data of the image; finally, a light source is placed in the scene at the position corresponding to the illumination direction of the target object 22, and an image 25 is placed at the corresponding position in the scene.
As shown in fig. 3, user 31 makes a weapon-firing motion toward the monster model 341, and the server 34 generates a corresponding light wave (i.e., virtual scene 342) based on the limb motion of user 31. The monster model 341 exhibits a knocked-down state in response to the weapon-firing action of user 31. The photographing device 33 captures the limb movements, expressions, language, and other information of user 31 and transmits them to the control display device 36 in real time for display; the server 34 likewise transmits the virtual scene 342 and the monster model 341 to the control display device 36 in real time based on the limb motion of user 31, so as to obtain the fused image 35.
In practical application, in order to improve the realism of the scene, a fusion device can be added to the augmented reality device; the fusion device is used to render and fuse the placed target object image with the display scene.
It can be seen from the above that, in the augmented reality-based target object throwing method provided by the embodiment of the invention, a three-dimensional model is built for the target object to be placed so as to obtain related data of the three-dimensional model; this improves the accuracy of the illumination direction of the target object in augmented reality and enhances the throwing effect.
In still another embodiment of the present invention, a throwing device of a target object based on augmented reality is also provided. As shown in fig. 3, the augmented reality-based target object throwing device may include a target module 31, a scanning module 32, a target acquisition module 33, an illumination direction calculation module 34, and a throwing module 35, where:
a target module 31 for providing a target object solid model;
a scanning module 32 for scanning the target module and providing three-dimensional data of the target module;
a target acquisition module 33 for acquiring image information data of the target module;
and the illumination direction calculation module 34 is used for acquiring the light source direction of the target object according to the image information data and the three-dimensional data of the target module.
and a throwing module 35, configured to place a light source at a corresponding position in the scene according to the light source direction of the target object, and to place the target object in the scene according to the image information data and the three-dimensional data.
In some embodiments, the scanning module 32 may include a data creation unit 321, an entity scanning unit 322, and a data processing unit 323, where:
a data establishing unit 321, configured to establish an information database of the target model;
the entity scanning unit 322 is configured to acquire a three-dimensional model and three-dimensional scanning model data of the target model and store the three-dimensional model and the three-dimensional scanning model data in the information database;
a data processing unit 323 for calculating a normal of the three-dimensional model and a first derivative of the normal from the three-dimensional model.
In some embodiments, the target acquisition module 33 may include an image acquisition unit 331, an image processing unit 332, and a coordinate calculation unit 333.
An image acquisition unit 331 for acquiring image information of a target module;
an image processing unit 332, configured to calculate a pose relationship between the image acquisition unit and the target module according to the flag information in the image;
and a coordinate calculating unit 333, configured to perform projection according to the pose relationship between the image acquiring unit and the target module and the scan data of the three-dimensional model, and obtain the image coordinate corresponding to the three-dimensional model through processing.
In some embodiments, the illumination direction calculation module 34 may include a luminance data unit 341 and an illumination direction determination unit 342.
A luminance data unit 341, configured to extract luminance data of the image from the image;
the illumination direction determining unit 342 is configured to determine the illumination direction of the target module according to the brightness data of the image and the first derivative of the normal.
In some embodiments, the delivery module 35 includes a fusion unit 351 and a delivery unit 352.
A fusion unit 351 for fusing the rendered image with the real scene;
a dropping unit 352 for dropping the image and the light source at corresponding positions in the display scene.
As can be seen from the foregoing, the embodiment of the present invention obtains a target object throwing apparatus based on augmented reality, by performing three-dimensional reconstruction on a target module to obtain three-dimensional information of the target object, then obtaining a normal direction of any position of a model through a three-dimensional data model, and then calculating an illumination direction by using the obtained normal data and related data of the image, so as to throw a light source and the target object in corresponding positions in a scene, thereby obtaining a simple and effective target object throwing method and apparatus for augmented reality.
The use of the terms "a" and "an" and "the" and similar referents in the context of describing the concepts of the invention (especially in the context of the following claims) is to be construed to cover both the singular and the plural. Furthermore, unless otherwise indicated herein, recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. In addition, the steps of all methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context; the invention is not limited to the described order of the steps. The use of any and all examples, or exemplary language (e.g., "such as") provided herein, is intended merely to better illuminate the invention and does not pose a limitation on its scope unless otherwise claimed. Numerous modifications and adaptations will be readily apparent to those skilled in the art without departing from the spirit and scope of the invention.
The method and the device for throwing the target object based on the augmented reality provided by the embodiment of the invention are described in detail. It should be understood that the exemplary embodiments described herein are to be considered merely descriptive for aiding in the understanding of the method of the present invention and its core concepts and not for limiting the invention. The description of features or aspects in each of the exemplary embodiments should generally be construed as applicable to similar features or aspects in other exemplary embodiments. While the present invention has been described with reference to exemplary embodiments, various changes and modifications may be suggested to one skilled in the art. The present invention is intended to embrace those changes and modifications that fall within the scope of the appended claims.

Claims (10)

1. A method for throwing a target object based on augmented reality, characterized by comprising the following steps:
s101, obtaining a three-dimensional model of the target object, and obtaining related data of the three-dimensional model according to the three-dimensional model;
s102, acquiring an image, and acquiring related data of the image according to the image and the related data of the three-dimensional model;
s103, determining the illumination direction of the target object through the related data of the three-dimensional model and the related data of the image;
s104, a light source is put in a corresponding position in the scene according to the illumination direction of the target object, the image of the target object is detected to judge whether the illumination direction of the target object is accurate, and if the illumination direction of the target object is accurate, the target object is put in the scene according to the related data of the image.
2. The augmented reality-based target object delivery method according to claim 1, wherein the step of obtaining a three-dimensional model of the target object and obtaining relevant data of the three-dimensional model from the three-dimensional model comprises:
scanning the target object to obtain a three-dimensional model of the target object;
and calculating the normal line of the target object and the first derivative of the normal line of the target object according to the three-dimensional model of the target object.
3. The augmented reality-based object delivery method according to claim 2, wherein the step of acquiring an image and obtaining the relevant data of the image from the relevant data of the image and the three-dimensional model comprises:
acquiring an image through a camera;
obtaining the pose relationship between the camera and the target object according to the mark information in the image;
and projecting according to the pose relation between the camera and the target object and the related data of the three-dimensional model of the target object, and obtaining the coordinates of the image corresponding to the three-dimensional model.
4. The augmented reality-based target object delivery method according to claim 3, wherein determining the illumination direction of the target object from the three-dimensional model related data and the image related data comprises:
filtering the image;
obtaining brightness data of the image from the image according to the coordinates of the image;
and calculating the illumination direction of the target object by using the brightness data of the image and the first derivative of the normal line of the target object.
5. The augmented reality-based object delivery method of claim 4, wherein delivering the object in the scene based on the image-related data comprises:
and placing a target object in the scene according to the pose relation of the camera and the target object.
6. An augmented reality-based target object delivery device, comprising:
the target module is used for providing a solid model for throwing targets;
the scanning module is used for scanning the target module and providing three-dimensional data of the target module;
the target acquisition module is used for acquiring the image information data of the target module;
the direction calculation module is used for acquiring the light source direction of the target object according to the image information data and the three-dimensional data of the target module;
and the throwing module is used for placing a light source at a corresponding position in the scene according to the light source direction of the target object, detecting the image of the target object to judge whether the illumination direction of the target object is accurate, and throwing the target object in the scene according to the image information data and the three-dimensional data if the illumination direction of the target object is accurate.
7. The augmented reality-based target object delivery device of claim 6, wherein the scanning module comprises:
the data establishing unit is used for establishing an information database of the target module;
the entity scanning unit is used for acquiring the three-dimensional model of the target module and the scanning data of the three-dimensional model and storing the three-dimensional model and the scanning data in the information database;
and the data processing unit is used for calculating the normal line of the three-dimensional model and the first derivative of the normal line according to the three-dimensional model.
8. The augmented reality-based target object delivery device of claim 7, wherein the target acquisition module comprises:
the image acquisition unit is used for acquiring the image information of the target module;
the image processing unit is used for calculating the pose relation between the image acquisition unit and the target module according to the mark information in the image;
and the coordinate calculation unit is used for projecting according to the pose relation of the image acquisition unit and the target module and the scanning data of the three-dimensional model, and processing to obtain the image coordinates corresponding to the three-dimensional model.
9. The augmented reality-based target object delivery apparatus according to claim 8, wherein the direction calculation unit includes:
a luminance data unit for extracting luminance data of the image from the image;
and the illumination direction determining unit is used for determining the illumination direction of the target module according to the brightness data of the image and the first derivative of the normal.
10. The augmented reality-based target object delivery device of claim 6, wherein the delivery module comprises:
the fusion unit is used for fusing the rendered image with the real scene;
and the throwing unit is used for throwing the image and the light source at the corresponding position in the display scene.
CN201711226533.1A 2017-11-29 2017-11-29 Target object throwing method and device based on augmented reality Active CN109840948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711226533.1A CN109840948B (en) 2017-11-29 2017-11-29 Target object throwing method and device based on augmented reality


Publications (2)

Publication Number Publication Date
CN109840948A CN109840948A (en) 2019-06-04
CN109840948B true CN109840948B (en) 2023-08-15

Family

ID=66882043

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711226533.1A Active CN109840948B (en) 2017-11-29 2017-11-29 Target object throwing method and device based on augmented reality

Country Status (1)

Country Link
CN (1) CN109840948B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763090B (en) * 2020-11-06 2024-05-21 北京沃东天骏信息技术有限公司 Information processing method and device
CN112991556B (en) * 2021-05-12 2022-05-27 航天宏图信息技术股份有限公司 AR data display method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100962557B1 (en) * 2009-02-05 2010-06-11 한국과학기술원 Augmented reality implementation apparatus and method of the same
CN103218854A (en) * 2013-04-01 2013-07-24 成都理想境界科技有限公司 Method for realizing component marking during augmented reality process and augmented reality system
CN106981087A (en) * 2017-04-05 2017-07-25 杭州乐见科技有限公司 Lighting effect rendering intent and device
CN107071388A (en) * 2016-12-26 2017-08-18 深圳增强现实技术有限公司 A kind of three-dimensional augmented reality display methods and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Illumination direction estimation for virtual humans in augmented reality; Du Shuxia et al.; Optical Technique (《光学技术》); 2016-11-15 (No. 06); p. 482 *

Also Published As

Publication number Publication date
CN109840948A (en) 2019-06-04

Similar Documents

Publication Publication Date Title
CN106355153B (en) A kind of virtual objects display methods, device and system based on augmented reality
US9654734B1 (en) Virtual conference room
US10692288B1 (en) Compositing images for augmented reality
US9392248B2 (en) Dynamic POV composite 3D video system
CN112198959A (en) Virtual reality interaction method, device and system
CN106464773B (en) Augmented reality device and method
KR101961758B1 (en) 3-Dimensional Contents Providing System, Method and Computer Readable Recoding Medium
WO2022022029A1 (en) Virtual display method, apparatus and device, and computer readable storage medium
WO2010038693A1 (en) Information processing device, information processing method, program, and information storage medium
CN110866977A (en) Augmented reality processing method, device and system, storage medium and electronic equipment
CN107167077B (en) Stereoscopic vision measuring system and stereoscopic vision measuring method
CN109035415B (en) Virtual model processing method, device, equipment and computer readable storage medium
CN111696215A (en) Image processing method, device and equipment
CN109640070A (en) A kind of stereo display method, device, equipment and storage medium
Reimat et al. Cwipc-sxr: Point cloud dynamic human dataset for social xr
CN112419388A (en) Depth detection method and device, electronic equipment and computer readable storage medium
CN109840948B (en) Target object throwing method and device based on augmented reality
KR20180120456A (en) Apparatus for providing virtual reality contents based on panoramic image and method for the same
Zerman et al. User behaviour analysis of volumetric video in augmented reality
US10582190B2 (en) Virtual training system
CN109407824B (en) Method and device for synchronous motion of human body model
TW201508547A (en) Interactive reality system and method featuring the combination of on-site reality and virtual component
CN113706720A (en) Image display method and device
CN112073632A (en) Image processing method, apparatus and storage medium
CN109829960A (en) A kind of VR animation system interaction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant