CN106296348A - Indoor scene simulation system implemented based on virtual reality method, and method thereof - Google Patents

Indoor scene simulation system implemented based on virtual reality method, and method thereof Download PDF

Info

Publication number
CN106296348A
Authority
CN
China
Prior art keywords
virtual reality
helmet
video camera
reflective spot
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610628380.2A
Other languages
Chinese (zh)
Inventor
陈涛
许博凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201610628380.2A priority Critical patent/CN106296348A/en
Publication of CN106296348A publication Critical patent/CN106296348A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/16Real estate

Abstract

The invention discloses an indoor scene simulation system implemented based on a virtual reality method, and a method thereof. The system comprises a head-mounted device, reflective markers (Markers), cameras, and a Unity- or UE4-based virtual scene construction module. The head-mounted device comprises a processor and a display screen; the processor is connected to the display screen, the display screen faces the user's eyes, and the head-mounted device projects a virtual reality scene composed of 3D models into the user's eyes through the display screen. The reflective markers are arranged on the head-mounted device. The cameras capture the reflective markers on the head-mounted device so as to capture the user's position and orientation and keep tracking them in real time. Together with the corresponding method, the system avoids the prior-art drawbacks of increased home-purchase risk and reduced satisfaction with the future home.

Description

Indoor scene simulation system implemented based on virtual reality method, and method thereof
Technical field
The present invention relates to the technical field of indoor virtual-space simulation systems, and in particular to an indoor scene simulation system implemented based on a virtual reality method and a method thereof, and especially to a virtual-scene show-house display system based on a virtual reality method and a method thereof.
Background art
China has developed rapidly over the past few decades, and the pace of urban construction has repeatedly reached new highs. With urbanization, people from all over the country have moved in large numbers to a few major cities, and building construction has become a pillar industry. The traditional real-estate industry, however, has various limitations and room for improvement. One of them is the show house used in the real-estate sales chain. Traditionally, a property developer can effectively begin selling to potential customers only after the show house has been fully decorated. Decorating a show house is both time-consuming and expensive; it greatly increases the up-front investment and delays sales. After the sales period, disposing of the leftover show house is also a problem. A show house offers little variability: once decorated, it is almost impossible to change.
In the traditional home-buying model, the show house is the buyer's most intuitive reference when considering a future residence. However, given the supply shortage in the property markets of many domestic cities, many buyers who want to seize the first opportunity to purchase have to make a purchase decision before the show house is even built, which increases the purchase risk and the problem of dissatisfaction with the future home.
Therefore, it is necessary to provide a virtual-reality-based way of simulating indoor scenes and real-estate show houses that can solve the above problems.
Summary of the invention
The technical problem to be solved by the present invention is to provide an indoor scene simulation system implemented based on a virtual reality method, and a method thereof, which avoid the prior-art drawbacks of increased home-purchase risk and reduced satisfaction with the future home.
To solve the above technical problem, the technical solution of the present invention is:
An indoor scene simulation system implemented based on a virtual reality method comprises a head-mounted device, reflective markers (Markers), cameras, and a Unity- or UE4-based virtual scene construction module;
the head-mounted device comprises a processor and a display screen, the processor is connected to the display screen, the display screen faces the user's eyes, and the head-mounted device projects a virtual reality scene composed of 3D models into the user's eyes through the display screen;
the reflective markers are arranged on the head-mounted device;
the cameras capture the reflective markers on the head-mounted device so as to capture the user's position and orientation and keep tracking them in real time;
the Unity- or UE4-based virtual scene construction module and a location-information receiving module are embedded in the processor and are used to construct the virtual reality scene; the virtual reality scene is built from 3D models and rendered by the UE4 3D engine or the Unity3D engine.
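The following is a minimal, illustrative sketch (not part of the original disclosure) of the data that flows between the components just listed: per-camera observations of a reflective marker, and the pose that the triangulation step produces and the coordinate output module sends to the headset processor. All type and field names are assumptions made for illustration.

```cpp
// Illustrative data layout for the tracking pipeline described above.
#include <cstdint>
#include <vector>

struct MarkerObservation {   // one reflective-marker detection in one camera image
    int      cameraId;
    double   pixelU, pixelV; // 2D image coordinates of the marker centroid
    uint64_t timestampUs;    // capture time in microseconds
};

struct HeadPose {            // result of triangulation, sent to the headset processor
    double x, y, z;          // marker position in the venue frame (metres)
    double yaw, pitch, roll; // orientation, e.g. refined with gyroscope data
    uint64_t timestampUs;
};

// The coordinate output module would collect the observations from all cameras for
// one time slice and hand them to a triangulation routine (sketched further below).
using ObservationFrame = std::vector<MarkerObservation>;
```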
The method of the indoor scene simulation system implemented based on a virtual reality method is as follows:
First, the cameras are started to capture images of the reflective markers on the head-mounted device, and the images are sent through a hub (HUB) to the intelligent terminal; using a triangulation algorithm (a minimal sketch is given after these steps), the intelligent terminal computes the spatial position of the captured target, i.e. the reflective marker, at that point in time, and transmits the target position in real time to the processor of the head-mounted device through a coordinate output module;
then the location-information receiving module improves the transmission rate by compressing the tracking data, and the processor starts the Unity- or UE4-based virtual scene construction module to decode and integrate the received coordinate information in real time, construct the virtual reality scene, and complete an equal-proportion imitation of the movement within the virtual scene;
the head-mounted device, having started the Unity- or UE4-based virtual scene construction module to decode and integrate the received coordinate information in real time and construct the virtual reality scene, presents the scene to the user from a first-person perspective through the display screen facing the user's eyes.
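As a concrete illustration of the triangulation step referred to above, the following sketch (an assumption about one standard way to implement it, not the patent's own code) computes the least-squares intersection of the viewing rays that two or more calibrated cameras cast toward the reflective marker. Converting a 2D marker detection into such a ray requires the camera calibration, which is assumed to be available.

```cpp
// Least-squares triangulation of one marker from several camera rays.
#include <array>
#include <vector>

struct Vec3 { double x, y, z; };

struct Ray {            // one camera's view of the reflective marker
    Vec3 origin;        // camera optical centre in the venue frame
    Vec3 dir;           // unit direction from the camera toward the marker
};

// Solve A * p = b for a 3x3 system with Cramer's rule (A assumed non-singular,
// i.e. the cameras do not all look along the same line).
static Vec3 solve3x3(const std::array<std::array<double, 3>, 3>& A, const Vec3& b) {
    auto det = [](const std::array<std::array<double, 3>, 3>& m) {
        return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
             - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
             + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
    };
    const double D = det(A);
    auto replaceCol = [&](int c) {
        auto m = A;
        m[0][c] = b.x; m[1][c] = b.y; m[2][c] = b.z;
        return det(m);
    };
    return { replaceCol(0) / D, replaceCol(1) / D, replaceCol(2) / D };
}

// Point minimising the summed squared distance to all rays ("midpoint" method).
Vec3 triangulate(const std::vector<Ray>& rays) {
    std::array<std::array<double, 3>, 3> A{};   // zero-initialised accumulator
    Vec3 b{0.0, 0.0, 0.0};
    for (const Ray& r : rays) {
        const double d[3] = { r.dir.x, r.dir.y, r.dir.z };
        const double o[3] = { r.origin.x, r.origin.y, r.origin.z };
        double M[3][3];
        // Accumulate (I - d d^T) into A and (I - d d^T) * o into b.
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j) {
                M[i][j] = (i == j ? 1.0 : 0.0) - d[i] * d[j];
                A[i][j] += M[i][j];
            }
        b.x += M[0][0] * o[0] + M[0][1] * o[1] + M[0][2] * o[2];
        b.y += M[1][0] * o[0] + M[1][1] * o[1] + M[1][2] * o[2];
        b.z += M[2][0] * o[0] + M[2][1] * o[1] + M[2][2] * o[2];
    }
    return solve3x3(A, b);
}
```

With at least two cameras whose rays are not parallel the 3x3 system is well conditioned; using three or more cameras, as the description below requires, makes the estimate more robust to detection noise.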
When the picture of the virtual reality scene cannot maintain a sufficient frame rate, the processor can also apply an asynchronous time-warp algorithm and start the gyroscope to collect data, so as to predict the likely rotation and position of the user's head after the display refreshes, and render intermediate frames from the predicted data to compensate.
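The prediction half of that asynchronous time-warp step could look like the following sketch, which simply integrates the gyroscope's angular velocity over the time remaining until the next display refresh to obtain the expected head orientation; the re-projection of the last rendered frame with this predicted orientation is left to the Unity or UE4 engine. This is an illustrative assumption, not the patent's implementation.

```cpp
// Constant-angular-velocity head-orientation prediction from gyroscope data.
#include <cmath>

struct Quat { double w, x, y, z; };

static Quat multiply(const Quat& a, const Quat& b) {   // Hamilton product
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Predict the head orientation dtSeconds ahead of `current`, assuming the angular
// velocity (rad/s, body frame) reported by the gyroscope stays constant.
Quat predictOrientation(const Quat& current,
                        double wx, double wy, double wz, double dtSeconds) {
    const double norm = std::sqrt(wx*wx + wy*wy + wz*wz);
    const double angle = norm * dtSeconds;              // total rotation over dt
    if (angle < 1e-9) return current;                   // effectively no rotation
    const double s = std::sin(angle / 2.0) / norm;      // scaled rotation axis
    const Quat delta { std::cos(angle / 2.0), wx * s, wy * s, wz * s };
    return multiply(current, delta);                    // apply body-frame rotation
}
```

For a 90 Hz display, dtSeconds would typically be on the order of 11 ms or less.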
The actual indoor scene is simulated by a virtual model. The optical tracking cameras or photosensitive cameras achieve motion capture based on computer-vision principles by monitoring and tracking the target, namely the reflective markers on the head-mounted device. From the image parameters of a reflective marker captured at the same moment, the optical tracking cameras or photosensitive cameras can determine its spatial position. By shooting continuously, the movement trajectory of the reflective marker, including the distance travelled, can be obtained from the image sequence.
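For illustration only, the sketch below shows the simplest form of the image-side step described above: because the reflective marker is far brighter than its surroundings, thresholding a grayscale frame and taking the centroid of the bright pixels yields its 2D image position, which the triangulation step then combines across cameras. A production optical-tracking system would add blob filtering, multi-marker association and sub-pixel refinement.

```cpp
// Locate a single bright reflective marker in one grayscale camera frame.
#include <cstdint>
#include <optional>
#include <vector>

struct Centroid { double u, v; };   // sub-pixel image coordinates of the marker

std::optional<Centroid> findMarker(const std::vector<uint8_t>& gray,
                                   int width, int height, uint8_t threshold = 240) {
    double sumU = 0.0, sumV = 0.0;
    long   count = 0;
    for (int v = 0; v < height; ++v)
        for (int u = 0; u < width; ++u)
            if (gray[static_cast<size_t>(v) * width + u] >= threshold) {
                sumU += u; sumV += v; ++count;    // accumulate bright pixels
            }
    if (count == 0) return std::nullopt;          // marker not visible in this frame
    return Centroid{ sumU / count, sumV / count };
}
```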
Brief description of the drawings
Fig. 1 is a schematic diagram of the overall structure of the indoor scene simulation system implemented based on a virtual reality method.
Fig. 2 is a schematic structural diagram of the head-mounted device.
Detailed description of the invention
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are only intended to explain the present invention and are not intended to limit it.
As shown in Fig. 1 and Fig. 2, the indoor scene simulation system implemented based on a virtual reality method comprises a head-mounted device, reflective markers 5, cameras, and a Unity- or UE4-based virtual scene construction module;
the head-mounted device 1 comprises a processor 12 and a display screen 14; the processor is connected to the display screen 14, the display screen 14 faces the user's eyes, and the head-mounted device 1 projects a virtual reality scene composed of 3D models into the user's eyes through the display screen 14;
the reflective markers are arranged on the head-mounted device 1;
the cameras 4 capture the reflective markers on the head-mounted device 1 so as to capture the user's position and orientation and keep tracking them in real time;
the Unity- or UE4-based virtual scene construction module and the location-information receiving module are embedded in the processor and are used to construct the virtual reality scene; the virtual reality scene is built from 3D models and rendered by the UE4 3D engine or the Unity3D engine. The virtual reality scene can be loaded into the head-mounted device in advance and shown to the user. In use, combined with the cameras' real-time tracking of the user's coordinates, the user is visually placed at the starting point of the virtual scene. From that starting point, every movement of the user's position and viewing angle is captured by the cameras and directly changes the picture on the display screen of the head-mounted device, so as to simulate the user's real movement within the virtual scene.
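A minimal sketch of the "equal-proportion" mapping described above follows; it assumes the tracked physical position is simply offset and uniformly scaled into the coordinates of the virtual show house, with the scale kept at 1.0 so that real and virtual movement match. The names and the choice of a uniform scale are illustrative assumptions.

```cpp
// Map a tracked physical position into the virtual show-house coordinates.
struct Vec3f { float x, y, z; };

struct VenueToSceneMap {
    Vec3f venueOrigin;   // physical point corresponding to the scene's start point
    Vec3f sceneStart;    // starting point inside the virtual show house
    float scale = 1.0f;  // 1.0 keeps real and virtual movement equal in proportion

    Vec3f toScene(const Vec3f& tracked) const {
        return { sceneStart.x + (tracked.x - venueOrigin.x) * scale,
                 sceneStart.y + (tracked.y - venueOrigin.y) * scale,
                 sceneStart.z + (tracked.z - venueOrigin.z) * scale };
    }
};
```

The resulting scene position would then be assigned to the virtual camera each frame by the Unity or UE4 scene.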
The head-mounted device 1 is a virtual reality headset.
The head-mounted device 1 also includes a gyroscope and a wifi module connected to the processor; the wifi module can be replaced by a bluetooth module.
The cameras 4 are connected to an intelligent terminal through a hub (HUB), and the intelligent terminal includes a coordinate output module. The cameras constitute the dynamic tracking part; there are more than six of them, arranged around the show venue and mounted on stands. The overlapping region of the cameras' fields of view constitutes the dynamically capturable region, and the cameras are positioned so that at any given moment at least three of them can capture the reflective markers on the head-mounted device.
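The layout constraint just stated (at least three cameras seeing the marker at any moment) can be checked with a simple geometric test such as the sketch below, which is an illustrative assumption rather than the patent's procedure: each camera is modelled by its position, viewing direction, field of view and usable range, and a point belongs to the dynamically capturable region if at least three cameras see it.

```cpp
// Count how many cameras cover a point and require at least three.
#include <cmath>
#include <vector>

struct Camera {
    double px, py, pz;     // camera position in the venue (metres)
    double dx, dy, dz;     // unit viewing direction
    double halfFovRad;     // half of the (circular) field of view, in radians
    double maxRange;       // beyond this distance the marker is too small to detect
};

bool sees(const Camera& c, double x, double y, double z) {
    const double vx = x - c.px, vy = y - c.py, vz = z - c.pz;
    const double dist = std::sqrt(vx*vx + vy*vy + vz*vz);
    if (dist < 1e-9 || dist > c.maxRange) return false;
    const double cosAngle = (vx*c.dx + vy*c.dy + vz*c.dz) / dist;  // off-axis angle
    return cosAngle >= std::cos(c.halfFovRad);
}

// True if the point lies inside the dynamically capturable region, i.e. is visible
// to at least `minCameras` cameras simultaneously (three, per the description above).
bool insideCaptureRegion(const std::vector<Camera>& cams,
                         double x, double y, double z, int minCameras = 3) {
    int visible = 0;
    for (const Camera& c : cams)
        if (sees(c, x, y, z) && ++visible >= minCameras) return true;
    return false;
}
```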
The cameras 4 are optical tracking cameras or photosensitive cameras.
The intelligent terminal is a PDA, a PC or a mobile terminal with a wifi module or a bluetooth module.
The display screen 14 is a high-definition display screen.
The user is provided with a remote controller 2.
The reflective markers include a main reflective marker 11 and auxiliary reflective markers 13; the main reflective marker 11 is arranged on top of the display screen, and the auxiliary reflective markers 13 are arranged on a length-adjustable headband beside the main reflective marker.
The actual indoor scene is simulated by a virtual model. The optical tracking cameras or photosensitive cameras achieve motion capture based on computer-vision principles by monitoring and tracking the target, namely the reflective markers on the head-mounted device. From the image parameters of a reflective marker captured at the same moment, the optical tracking cameras or photosensitive cameras can determine its spatial position. By shooting continuously, the movement trajectory of the reflective marker, including the distance travelled, can be obtained from the image sequence.
The method of the indoor scene simulation system implemented based on a virtual reality method is as follows:
First, the cameras are started to capture images of the reflective markers on the head-mounted device, and the images are sent through a hub (HUB) to the intelligent terminal; using a triangulation algorithm, the intelligent terminal computes the spatial position of the captured target, i.e. the reflective marker, at that point in time, and transmits the target position in real time to the processor of the head-mounted device through a coordinate output module;
in this indoor scene simulation realized with virtual reality technology, the head-mounted device is equipped with a location-information receiving module. The location-information receiving module then improves the transmission rate by compressing the tracking data, and the processor starts the Unity- or UE4-based virtual scene construction module to decode and integrate the received coordinate information in real time, construct the virtual reality scene, and complete an equal-proportion imitation of the movement within the virtual scene. Compressing the location information at the sending end and decoding it at the receiving end (a packing sketch is given after these steps) greatly reduces the transmission delay, to as little as 4 milliseconds;
the head-mounted device, having started the Unity- or UE4-based virtual scene construction module to decode and integrate the received coordinate information in real time and construct the virtual reality scene, presents the scene to the user from a first-person perspective through the display screen facing the user's eyes.
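The compression mentioned in the steps above is not specified further in the description; as one hedged illustration of how location updates could be kept small enough for a single-digit-millisecond budget, the sketch below quantises the position to millimetres and the heading to hundredths of a degree and packs them into eight bytes. The packet layout, field names and value ranges are assumptions made for illustration only.

```cpp
// Quantise and pack one location update into a small fixed-size record.
#include <cmath>
#include <cstdint>

struct PackedPose {              // 8 bytes per update
    int16_t  xMm, yMm, zMm;      // position, millimetres relative to the venue origin
    uint16_t yawCentiDeg;        // heading, hundredths of a degree (0..35999)
};

PackedPose pack(double xM, double yM, double zM, double yawDeg) {
    auto toMm = [](double metres) {
        // assumes the venue spans less than about +/-32 m in each axis
        return static_cast<int16_t>(std::lround(metres * 1000.0));
    };
    const double yaw = std::fmod(std::fmod(yawDeg, 360.0) + 360.0, 360.0);
    return { toMm(xM), toMm(yM), toMm(zM),
             static_cast<uint16_t>(std::lround(yaw * 100.0) % 36000) };
}

void unpack(const PackedPose& p, double& xM, double& yM, double& zM, double& yawDeg) {
    xM = p.xMm / 1000.0;
    yM = p.yMm / 1000.0;
    zM = p.zMm / 1000.0;
    yawDeg = p.yawCentiDeg / 100.0;
}
```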
When the picture of the virtual reality scene cannot maintain a sufficient frame rate, the processor can also apply an asynchronous time-warp algorithm and start the gyroscope to collect data, so as to predict the likely rotation and position of the user's head after the display refreshes, and render intermediate frames from the predicted data to compensate, thereby maintaining a higher picture refresh rate.
In the UE4-based virtual scene construction module, the UE4 3D engine adopts a rendering acceleration algorithm for the virtual scene to reduce latency.
In addition, the reflective markers captured by the cameras are precision spheres with 3M 7610 reflective tape and M4 threads, whose optical properties differ markedly from those of the natural environment. Light striking a reflective marker (Marker) from any angle is reflected with a luminance factor of more than 700x. The high luminance factor makes it possible to calibrate the marker's position in the cameras at the outset and to keep tracking it continuously.
In the indoor scene simulation realized with virtual reality technology, the Unity or UE4 virtual scene includes 3D model construction and Unity or UE4 engine effects rendering. Rendering the 3D models with particle effects, shadows and lighting greatly improves the realism of the models.
Taking the above ideal embodiments of the present invention as inspiration, and in light of the above description, those skilled in the art can make various changes and modifications without departing from the technical idea of the invention. The technical scope of the invention is not limited to the content of the description and must be determined according to the claims.

Claims (10)

1. An indoor scene simulation system implemented based on a virtual reality method, characterized by comprising a head-mounted device, reflective markers (Markers), cameras, and a Unity- or UE4-based virtual scene construction module;
the head-mounted device comprises a processor and a display screen, the processor is connected to the display screen, the display screen faces the user's eyes, and the head-mounted device projects a virtual reality scene composed of 3D models into the user's eyes through the display screen;
the reflective markers are arranged on the head-mounted device;
the cameras capture the reflective markers on the head-mounted device so as to capture the user's position and orientation and keep tracking them in real time;
the Unity- or UE4-based virtual scene construction module and a location-information receiving module are embedded in the processor and are used to construct the virtual reality scene; the virtual reality scene is built from 3D models and rendered by the UE4 3D engine or the Unity3D engine; the head-mounted device is a virtual reality headset.
2. The indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the head-mounted device also includes a gyroscope and a wifi module connected to the processor, and the wifi module can be replaced by a bluetooth module.
3. The indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the cameras are connected to an intelligent terminal through a hub (HUB), the intelligent terminal includes a coordinate output module, the cameras constitute the dynamic tracking part, there are more than six cameras, the more than six cameras are arranged around the show venue and mounted on stands, the overlapping region of the cameras' fields of view constitutes the dynamically capturable region, and the cameras are positioned so that at any given moment at least three of them can capture the reflective markers on the head-mounted device.
4. The indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the cameras are optical tracking cameras or photosensitive cameras.
5. The indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the intelligent terminal is a PDA, a PC or a mobile terminal with a wifi module or a bluetooth module.
6. The indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the intelligent terminal is a PDA, a PC or a mobile terminal with a wifi module or a bluetooth module.
7. The indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the user is provided with a remote controller, and the reflective markers captured by the cameras are precision spheres with 3M 7610 reflective tape and M4 threads.
8. The indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the reflective markers include a main reflective marker and auxiliary reflective markers, the main reflective marker is arranged on top of the display screen, and the auxiliary reflective markers are arranged on a length-adjustable headband beside the main reflective marker.
9. A method of the indoor scene simulation system implemented based on a virtual reality method according to claim 1, characterized in that the method is as follows:
first, the cameras are started to capture images of the reflective markers on the head-mounted device, and the images are sent through a hub (HUB) to the intelligent terminal; using a triangulation algorithm, the intelligent terminal computes the spatial position of the captured target, i.e. the reflective marker, at that point in time, and transmits the target position in real time to the processor of the head-mounted device through a coordinate output module;
then the location-information receiving module improves the transmission rate by compressing the tracking data, and the processor starts the Unity- or UE4-based virtual scene construction module to decode and integrate the received coordinate information in real time, construct the virtual reality scene, and complete an equal-proportion imitation of the movement within the virtual scene;
the head-mounted device, having started the Unity- or UE4-based virtual scene construction module to decode and integrate the received coordinate information in real time and construct the virtual reality scene, presents the scene to the user from a first-person perspective through the display screen facing the user's eyes;
when the picture of the virtual reality scene cannot maintain a sufficient frame rate, the processor can also apply an asynchronous time-warp algorithm and start the gyroscope to collect data, so as to predict the likely rotation and position of the user's head after the display refreshes, and render intermediate frames from the predicted data to compensate.
10. The method of the indoor scene simulation system implemented based on a virtual reality method according to claim 9, characterized in that in the UE4-based virtual scene construction module, the UE4 3D engine adopts a rendering acceleration algorithm for the virtual scene to reduce latency.
CN201610628380.2A 2016-08-03 2016-08-03 Indoor scene simulation system implemented based on virtual reality method, and method thereof Pending CN106296348A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610628380.2A CN106296348A (en) 2016-08-03 2016-08-03 Indoor scene simulation system implemented based on virtual reality method, and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610628380.2A CN106296348A (en) 2016-08-03 2016-08-03 Indoor scene simulation system implemented based on virtual reality method, and method thereof

Publications (1)

Publication Number Publication Date
CN106296348A true CN106296348A (en) 2017-01-04

Family

ID=57664506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610628380.2A Pending CN106296348A (en) Indoor scene simulation system implemented based on virtual reality method, and method thereof

Country Status (1)

Country Link
CN (1) CN106296348A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106254344A (en) * 2016-08-02 2016-12-21 天津奇幻岛科技有限公司 A kind of method expanding HTC vive architecture scope
CN106951882A (en) * 2017-03-31 2017-07-14 广州帕克西软件开发有限公司 The identification method for tracing and device of a kind of shape of face
CN107274472A (en) * 2017-06-16 2017-10-20 福州瑞芯微电子股份有限公司 A kind of method and apparatus of raising VR play frame rate
CN107507280A (en) * 2017-07-20 2017-12-22 广州励丰文化科技股份有限公司 Show the switching method and system of the VR patterns and AR patterns of equipment based on MR heads
CN107632704A (en) * 2017-09-01 2018-01-26 广州励丰文化科技股份有限公司 A kind of mixed reality audio control method and service equipment based on optical alignment
CN107632703A (en) * 2017-09-01 2018-01-26 广州励丰文化科技股份有限公司 Mixed reality audio control method and service equipment based on binocular camera
CN107704078A (en) * 2017-09-11 2018-02-16 广州慧玥文化传播有限公司 The method and system of MR patterns are realized based on optical alignment
CN108919940A (en) * 2018-05-15 2018-11-30 青岛大学 A kind of Virtual Campus Cruise System based on HTC VIVE
CN109240496A (en) * 2018-08-24 2019-01-18 中国传媒大学 A kind of acousto-optic interactive system based on virtual reality
CN109508319A (en) * 2018-11-09 2019-03-22 武汉兴联云立方科技有限公司 A kind of cross-platform resource management system of 3D and method
CN109529318A (en) * 2018-11-07 2019-03-29 艾葵斯(北京)科技有限公司 Virtual vision system
CN110489184A (en) * 2018-05-14 2019-11-22 北京凌宇智控科技有限公司 A kind of virtual reality scenario implementation method and its system based on UE4 engine
CN111652980A (en) * 2020-06-02 2020-09-11 西南石油大学 Complex virtual reality simulation system based on particle system
CN112070901A (en) * 2020-07-21 2020-12-11 马小淞 AR scene construction method and device for garden, storage medium and terminal

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101504583A (en) * 2008-12-22 2009-08-12 杨贻方 3D visible house viewing system
CN104765456A (en) * 2015-04-08 2015-07-08 成都爱瑞斯文化传播有限责任公司 Virtual space system and building method thereof
CN105445937A (en) * 2015-12-27 2016-03-30 深圳游视虚拟现实技术有限公司 Mark point-based multi-target real-time positioning and tracking device, method and system
TWI540534B (en) * 2015-02-26 2016-07-01 宅妝股份有限公司 Control system and method for virtual navigation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101504583A (en) * 2008-12-22 2009-08-12 杨贻方 3D visible house viewing system
TWI540534B (en) * 2015-02-26 2016-07-01 宅妝股份有限公司 Control system and method for virtual navigation
CN104765456A (en) * 2015-04-08 2015-07-08 成都爱瑞斯文化传播有限责任公司 Virtual space system and building method thereof
CN105445937A (en) * 2015-12-27 2016-03-30 深圳游视虚拟现实技术有限公司 Mark point-based multi-target real-time positioning and tracking device, method and system

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106254344A (en) * 2016-08-02 2016-12-21 天津奇幻岛科技有限公司 A kind of method expanding HTC vive architecture scope
CN106254344B (en) * 2016-08-02 2019-07-12 天津奇幻岛科技有限公司 A method of expanding HTC vive base station location range
CN106951882A (en) * 2017-03-31 2017-07-14 广州帕克西软件开发有限公司 The identification method for tracing and device of a kind of shape of face
CN106951882B (en) * 2017-03-31 2021-03-23 广州帕克西软件开发有限公司 Face shape recognition and tracking method and device
CN107274472A (en) * 2017-06-16 2017-10-20 福州瑞芯微电子股份有限公司 A kind of method and apparatus of raising VR play frame rate
CN107507280A (en) * 2017-07-20 2017-12-22 广州励丰文化科技股份有限公司 Show the switching method and system of the VR patterns and AR patterns of equipment based on MR heads
CN107632704A (en) * 2017-09-01 2018-01-26 广州励丰文化科技股份有限公司 A kind of mixed reality audio control method and service equipment based on optical alignment
CN107632703A (en) * 2017-09-01 2018-01-26 广州励丰文化科技股份有限公司 Mixed reality audio control method and service equipment based on binocular camera
CN107632704B (en) * 2017-09-01 2020-05-15 广州励丰文化科技股份有限公司 Mixed reality audio control method based on optical positioning and service equipment
CN107704078A (en) * 2017-09-11 2018-02-16 广州慧玥文化传播有限公司 The method and system of MR patterns are realized based on optical alignment
CN110489184B (en) * 2018-05-14 2023-07-25 北京凌宇智控科技有限公司 Virtual reality scene implementation method and system based on UE4 engine
CN110489184A (en) * 2018-05-14 2019-11-22 北京凌宇智控科技有限公司 A kind of virtual reality scenario implementation method and its system based on UE4 engine
CN108919940A (en) * 2018-05-15 2018-11-30 青岛大学 A kind of Virtual Campus Cruise System based on HTC VIVE
CN109240496A (en) * 2018-08-24 2019-01-18 中国传媒大学 A kind of acousto-optic interactive system based on virtual reality
CN109240496B (en) * 2018-08-24 2021-07-16 中国传媒大学 Acousto-optic interaction system based on virtual reality
CN109529318A (en) * 2018-11-07 2019-03-29 艾葵斯(北京)科技有限公司 Virtual vision system
CN109508319B (en) * 2018-11-09 2021-04-27 武汉兴联云立方科技有限公司 3D cross-platform resource management system and method
CN109508319A (en) * 2018-11-09 2019-03-22 武汉兴联云立方科技有限公司 A kind of cross-platform resource management system of 3D and method
CN111652980B (en) * 2020-06-02 2021-06-25 西南石油大学 Complex virtual reality simulation system based on particle system
CN111652980A (en) * 2020-06-02 2020-09-11 西南石油大学 Complex virtual reality simulation system based on particle system
CN112070901A (en) * 2020-07-21 2020-12-11 马小淞 AR scene construction method and device for garden, storage medium and terminal

Similar Documents

Publication Publication Date Title
CN106296348A (en) Indoor scene simulation system implemented based on virtual reality method, and method thereof
JP6431233B1 (en) Video distribution system that distributes video including messages from viewing users
CN102540464B (en) Head-mounted display device which provides surround video
US9122053B2 (en) Realistic occlusion for a head mounted augmented reality display
WO2018171041A1 (en) Moving intelligent projection system and method therefor
CN106162203B (en) Panoramic video playback method, player and head-mounted virtual reality device
CN101539804A (en) Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen
CN107065409A (en) Dynamic projection device and working method thereof
WO2013155217A1 (en) Realistic occlusion for a head mounted augmented reality display
CN107027015A (en) Augmented-reality-based 3D dynamic projection system and projection method for the system
US10977852B2 (en) VR playing method, VR playing device, and VR playing system
CN106210856B (en) Method and system for watching 3D panoramic video on an internet video live-broadcast platform
CN207212211U (en) Interactive intelligent window
CN206575538U (en) Dynamic intelligent projection display system
CN1204532C (en) Method and apparatus for rendering images with refractions
CN109255841A (en) AR image presentation method, device, terminal and storage medium
CN114125310B (en) Photographing method, terminal device and cloud server
Cubitt Making space
CN208506731U (en) Image display systems
CN106237588B (en) Multifunctional fitness system based on quadric-surface projection technique
CN109240499B (en) Virtual camera simulation interaction control system and method, and information data processing terminal
Takatori et al. Large-scale projection-based immersive display: The design and implementation of largespace
CN110458929A (en) Indoor rendering method and system based on Three.js
WO2019241712A1 (en) Augmented reality wall with combined viewer and camera tracking
US20230401791A1 (en) Landmark data collection method and landmark building modeling method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170104

RJ01 Rejection of invention patent application after publication