CN107145223A - Multi-point interaction control system and method based on the Unity 3D engine and VR headsets - Google Patents

Multi-point interaction control system and method based on the Unity 3D engine and VR headsets

Info

Publication number
CN107145223A
Authority
CN
China
Prior art keywords
handles
client
data
helmets
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710219286.6A
Other languages
Chinese (zh)
Inventor
刘向升
潘杰
郑浩
张康杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Techlink Intelligent Polytron Technologies Inc
Original Assignee
Beijing Techlink Intelligent Polytron Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Techlink Intelligent Polytron Technologies Inc filed Critical Beijing Techlink Intelligent Polytron Technologies Inc
Priority to CN201710219286.6A priority Critical patent/CN107145223A/en
Publication of CN107145223A publication Critical patent/CN107145223A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2004 Aligning objects, relative positioning of parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2008 Assembling, disassembling

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Architecture (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a multi-point interaction control system based on the Unity 3D engine and VR headsets, comprising: a data acquisition module for collecting the position of each client's corresponding VR handles in their space; a data transmission module for transmitting the position of each client's corresponding VR handles in their space to a server; a data processing module for aggregating and analysing the data, identifying the number of control points, and obtaining the movement data of the target object; and an interaction control module for returning the aggregated data to each client and feeding the movement data of the target object back into each client's corresponding VR headset. The invention further relates to a multi-point interaction control method based on the Unity 3D engine and VR headsets.

Description

Multi-point interaction control system and method based on the Unity 3D engine and VR headsets
Technical field
The invention belongs to the technical field of industrial machinery and equipment, and in particular relates to a multi-point interaction control method based on the Unity 3D engine and VR headsets. It is applicable to interactive control operations on the various equipment parts of an industrial machinery simulation system, and is mainly used for training personnel in industrial equipment disassembly and assembly, equipment operation, and maintenance.
Background technology
VR (Virtual Reality) technology is a computer simulation system that can create a virtual world for the user to experience. It uses a computer to generate a simulated environment: an interactive, three-dimensional dynamic scene fusing multiple information sources, with simulation of entity behaviour, that immerses the user in the environment.
Virtual reality technology (VR) mainly involves the simulated environment, perception, natural skills and sensing devices. The simulated environment is a real-time, dynamic, three-dimensional photorealistic image generated by a computer. Perception means that an ideal VR system should offer all the senses a person has: besides the visual perception produced by computer graphics, it also includes hearing, touch, force feedback and motion perception, and even smell and taste; this is also called multi-perception. Natural skills refer to a person's head rotation, eye movement, gestures and other body actions; the computer processes data matching the participant's actions, responds to the user's input in real time, and feeds the response back to the user's senses. Sensing devices are the three-dimensional interaction devices.
VR essentially uses a computer-generated simulated environment into which the user is brought through various sensing devices, realising direct interaction with that environment. Because virtual reality technology solves many practical problems, saves money and is not limited by the physical environment, it has become a widely welcomed technology in fields such as industry, entertainment, games, the military and tourism. For a long time to come, VR technology will continue to mature and bring people ever richer three-dimensional sensory experiences.
Industrial VR machinery is currently one of the broader fields of VR application. In some scenarios, machinery needs to be disassembled jointly by several people working together, and so far there has been no system or method supporting such a scene.
Summary of the invention
In view of the lack in the prior art of a mature interaction control scheme that tightly integrates VR handles with virtual reality, the present invention proposes a method for interaction between handles and virtual equipment parts in a virtual reality environment, so as to solve the above problem.
According to a first aspect of the invention, a multi-point interaction control system based on the Unity 3D engine and VR headsets is provided, comprising:
a first data acquisition module for collecting the position of the first client's corresponding first VR handles in their space;
a second data acquisition module for collecting the position of the second client's corresponding second VR handles in their space;
a first data transmission module for transmitting the position of the first client's corresponding first VR handles in their space to a server;
a second data transmission module for transmitting the position of the second client's corresponding second VR handles in their space to the server;
a data processing module for aggregating and analysing the data, identifying the number of control points, and obtaining the movement data of the target object;
an interaction control module for returning the aggregated data to the first client and the second client, and feeding the movement data of the target object back into the first client's corresponding first VR headset and the second client's corresponding second VR headset.
Preferably, aggregating and analysing the data includes merging the positions of the first VR handles and the second VR handles in their respective spaces into the same space.
Preferably, the interaction control module feeds the spatial positions of the second VR handles and the first VR handles, merged into the same space, back into the first VR headset.
Preferably, the interaction control module feeds the spatial positions of the first VR handles and the second VR handles, merged into the same space, back into the second VR headset.
Preferably, the first data transmission module and the second data transmission module transmit to the server in real time over a wired or wireless connection.
According to a second aspect of the invention, a multi-point interaction control method based on the Unity 3D engine and VR headsets is provided, comprising:
S110: collecting the position of each client's corresponding VR handles in their space;
S120: transmitting the position of each client's corresponding VR handles in their space to a server;
S130: aggregating and analysing the data, identifying the number of control points, and obtaining the movement data of the target object;
S140: returning the aggregated data to each client, and feeding the movement data of the target object back into each client's corresponding VR headset.
Preferably, aggregating and analysing the data includes merging the positions of each client's corresponding VR handles in their spaces into the same space.
Preferably, the positions of the VR handles in their spaces are transmitted to the server in real time over a wired or wireless connection.
Preferably, the position of each client's corresponding VR handles in their space is transmitted in real time into that client's corresponding VR headset.
Brief description of the drawings
Other features, objects and advantages of the present invention will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings:
Fig. 1 is a schematic diagram of the system of some embodiments of the present invention.
Fig. 2 is a structural diagram of the system of some embodiments of the present invention.
Fig. 3 is a screenshot of the VR visual effect of some embodiments of the present invention.
Fig. 4 is a flow chart of the method of some embodiments of the present invention.
Detailed description of the embodiments
In the following description, numerous specific details are given to provide a more thorough understanding of the invention. However, it will be apparent to those skilled in the art that the invention can be practised without one or more of these details. In other instances, certain technical features well known in the art are not described, in order to avoid obscuring the invention.
In the present invention, the term "Unity software" refers to software developed on the Unity 3D engine platform. Unity is a multi-platform, comprehensive game development tool from Unity Technologies that lets creators easily produce interactive content such as 3D video games, architectural visualisations and real-time 3D animations; it is a fully integrated professional game engine.
In the present invention, the term "Lighthouse" refers to HTC Vive Lighthouse, the positioning system used by HTC's VR equipment. It consists of two laser base stations: each base station contains an infrared LED array and two rotating infrared laser emitters whose axes of rotation are mutually orthogonal, each rotor completing one revolution every 10 ms. A base station works in 20 ms cycles: at the start of a cycle the infrared LEDs flash; during the first 10 ms the rotating laser of the X axis sweeps across the player's movement area while the Y axis does not emit; during the next 10 ms the rotating laser of the Y axis sweeps the player's movement area while the X axis does not emit.
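As a rough illustration of this timing scheme (the patent gives no code, and the constants and function below are assumptions): a photosensor that records when the laser sweep hits it, relative to the sync flash, can recover the sweep angle, since each rotor completes one revolution in 10 ms.

```python
import math

SYNC_PERIOD = 0.020   # one full base-station cycle: 20 ms
SWEEP_PERIOD = 0.010  # one rotor revolution: 10 ms

def sweep_angle(hit_time, sync_time):
    """Convert the time at which a sensor saw the laser, relative to the
    sync flash, into the sweep angle in radians."""
    dt = (hit_time - sync_time) % SWEEP_PERIOD
    return 2.0 * math.pi * dt / SWEEP_PERIOD

# A hit a quarter of a revolution (2.5 ms) after the sync flash
# corresponds to a 90-degree sweep angle.
angle = sweep_angle(0.0025, 0.0)
```

Two such angles, one from the X sweep and one from the Y sweep, give the bearing from a base station to the sensor; intersecting the bearings from both base stations yields the 3D position.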
In the present invention, the term "multi-point control" refers to the control of an object by multiple handles within the same scene. In VR, a handle simulates a hand to control a model object; a handle that performs a grab action on a part is counted as one control point. Identifying the number of control points means identifying how many handles are currently collided with the current part model and bound to it (the handle being represented, for example, as a hand in the virtual space). A control point is thus the binding point of a handle and a part in the virtual space, and the virtual object is moved and interacted with through multiple control points.
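The control-point identification described here can be sketched as follows. The `Handle` fields and function names are illustrative assumptions; a real implementation would query the engine's collision and input state.

```python
from dataclasses import dataclass

@dataclass
class Handle:
    client_id: int
    colliding_with_part: bool  # handle's collider overlaps the part model
    grip_pressed: bool         # user is performing the grab action

def count_control_points(handles):
    """A handle counts as a control point only while it both touches the
    part and performs the grab action, i.e. while it is bound to the part."""
    return sum(1 for h in handles if h.colliding_with_part and h.grip_pressed)

# two clients, two handles each; only the grabbing, touching handles count
handles = [Handle(1, True, True), Handle(1, True, False),
           Handle(2, True, True), Handle(2, False, True)]
n = count_control_points(handles)
```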
In certain embodiments of the present invention, the two handles of a VR device simulate the two hands of the virtual space, to achieve a grab-and-control effect on virtual objects. As shown in Fig. 3, two sets of VR devices jointly control a virtual part through four handles, giving four control points.
In other embodiments of the present invention, the two handles of a VR device simulate two hands of the virtual space holding a tool (a virtual tool), to achieve a grab-and-control effect on a virtual object (such as a part). In this case a control point may be a point of the virtual object bound indirectly through the handle (the hand in the virtual scene), for example the binding point between the part and the handle-controlled virtual tool.
Just as there are movement and rotation operations on objects in the real world, the virtual space seen in the VR headset likewise needs movement and rotation operations on virtual objects, to achieve the effect of simulating the real world and thereby realise industrial virtual training.
Fig. 1 shows a schematic diagram of the system of some embodiments of the present invention. The first VR device and the second VR device maintain data transmission and data sharing with the server through their respective clients. The server collects and converts the received data, then transmits it through the clients to each VR device. In this way, one VR device can "see" the actions of the other VR device, realising interaction within the same virtual space.
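A minimal sketch of the server-side collection and redistribution described here, assuming the server simply keys each latest pose by (client, handle) and broadcasts the merged view back to every client; all names are illustrative, not from the patent.

```python
def merge_and_broadcast(client_states):
    """Merge the latest handle poses reported by every client into one
    shared scene state, and return the copy each client should receive."""
    shared = {}
    for client_id, poses in client_states.items():
        for handle_id, pose in poses.items():
            # key by (client, handle) so poses from different clients
            # never collide in the shared scene
            shared[(client_id, handle_id)] = pose
    # every client receives the same merged view of all devices
    return {client_id: dict(shared) for client_id in client_states}

view = merge_and_broadcast({1: {0: (0.1, 1.2, -0.3)},
                            2: {0: (0.4, 1.0, 0.2)}})
```

Because each client receives the full merged state, each headset can render the other device's handles and thereby "see" its actions.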
The multi-point interaction control system of the present invention comprises a data acquisition module, a data transmission module, a parsing and identification module, and an interaction control module, as shown in Fig. 2.
Data acquisition module:
S110: collect device data. The VR handle data are obtained. Each VR device has its own spatial positioning technology, through which the positions of multiple operating handles are obtained, so that in the VR headset the hand model can clearly be seen following the movement of the handle. For example, HTC's Lighthouse indoor positioning belongs to laser-scanning positioning: the position of a moving object is determined by lasers and photosensors. Two laser emitters are installed diagonally, forming a rectangular area of adjustable size. Laser beams are emitted from two rows of fixed LEDs inside the emitters, six times per second. Each laser emitter contains two scanning modules, which in turn sweep the positioning space with lasers in the horizontal and vertical directions. The VR handle data obtained include the coordinate (Position) and the rotation (Rotation).
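The handle data described here (a Position coordinate and a Rotation) can be modelled as one small record per handle. The exact runtime API is device-specific, so the input format below is an assumption.

```python
from dataclasses import dataclass

@dataclass
class HandlePose:
    handle_id: int
    position: tuple  # Position: (x, y, z) in the client's own tracked space
    rotation: tuple  # Rotation: quaternion (x, y, z, w)

def sample_handles(raw_frames):
    """Convert one tracking sample per handle into HandlePose records.
    `raw_frames` stands in for whatever the device runtime delivers;
    (id, position, rotation) tuples are assumed here."""
    return [HandlePose(i, p, r) for i, p, r in raw_frames]

# one sample from a single handle
poses = sample_handles([(0, (0.1, 1.2, -0.3), (0.0, 0.0, 0.0, 1.0))])
```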
Data transmission module:
S120: transmit device data. As shown in Fig. 1, the collected information of the multiple devices is sent from the client to the server. The device data recorded in real time are transmitted over a wired or wireless connection using a custom protocol, which ensures efficient and stable data transfer so that the picture seen in the headset is clear and lifelike.
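The patent only says the pose data travel over a custom protocol; as one possible sketch, each pose can be packed into a fixed-size binary record. This wire layout is an assumption, not the patent's.

```python
import struct

# Hypothetical wire layout for one handle pose: handle id as uint16,
# then position x, y, z and rotation quaternion x, y, z, w as
# little-endian 32-bit floats.
POSE_FORMAT = "<H3f4f"

def pack_pose(handle_id, position, rotation):
    """Serialise one pose into a fixed-size binary record."""
    return struct.pack(POSE_FORMAT, handle_id, *position, *rotation)

def unpack_pose(payload):
    """Inverse of pack_pose."""
    handle_id, px, py, pz, qx, qy, qz, qw = struct.unpack(POSE_FORMAT, payload)
    return handle_id, (px, py, pz), (qx, qy, qz, qw)

payload = pack_pose(3, (0.1, 1.2, -0.3), (0.0, 0.0, 0.0, 1.0))
```

A fixed-size record keeps per-frame framing trivial, which matters when poses are streamed many times per second.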
Data processing module:
S130: data analysis and processing. On the server, the received data are aggregated and analysed and the number of control points is identified; after the data of the multiple control points are combined by a weighted-average algorithm, the movement data of the target object are obtained.
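The weighted average over control points can be sketched as below. The patent does not state how the weights are chosen, so equal weights are assumed when none are given.

```python
def object_move(control_deltas, weights=None):
    """Combine the per-frame displacement of every control point into one
    displacement for the target object via a weighted average.
    Equal weights are assumed when none are supplied."""
    if weights is None:
        weights = [1.0] * len(control_deltas)
    total = sum(weights)
    return tuple(
        sum(w * d[axis] for w, d in zip(weights, control_deltas)) / total
        for axis in range(3)
    )

# two handles pull the part along x and y respectively
move = object_move([(0.2, 0.0, 0.0), (0.0, 0.2, 0.0)])
```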
Interactive controlling module:
S140: interaction control. The analysed and processed data are passed back from the server to the clients through the custom network protocol, and each client feeds the returned data back into its VR headset, as shown in Fig. 1. Through such real-time data updates, the target object can clearly be seen in the VR glasses moving under the joint control of the multiple handles.
Fig. 3 is a screenshot of the VR visual effect of some embodiments of the present invention. Besides displaying the positions of the first VR handles in the space, the first VR headset can also display the positions of the second VR handles in the visible space, and even the position of the second VR headset, whose display principle is similar to that of the second VR handles.
In certain embodiments of the present invention, the first data acquisition module collects the positions of the first VR headset and the first VR handles in the space corresponding to the first client and feeds them back into the first VR display module, while also transmitting them through the first data transmission module to the server to be shared with the other clients; the second client works in the same way, as shown in Fig. 2. The interaction control module converts the positions of each client's VR headset and handles, after parsing, into the same space and feeds them back through each client's data transmission module into that client's VR display module. For example, the positions of the second VR handles and the second VR headset are fed back into the first VR display module, so that the rotation and movement of the second VR handles and the second VR headset can be seen in the first VR headset.
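Merging each client's poses "into the same space" can be sketched as applying a per-client origin offset. A pure translation is assumed here for illustration; a full implementation would also apply each client's orientation.

```python
def to_shared_space(position, client_origin):
    """Map a position from one client's own tracked space into the shared
    virtual space by adding that client's origin offset."""
    return tuple(p + o for p, o in zip(position, client_origin))

# Client 2's tracked space is placed 2 m along x in the shared scene, so
# its handle at local (0.5, 1.0, 0.0) appears at (2.5, 1.0, 0.0) when
# rendered for client 1.
shared = to_shared_space((0.5, 1.0, 0.0), (2.0, 0.0, 0.0))
```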
The invention is not limited to the above embodiments; various changes can be made within the scope of the inventive concept. The invention has been illustrated by the above embodiments, but it should be understood that they serve only for illustration and description and are not intended to limit the invention to the scope of the described embodiments. Those skilled in the art will appreciate that more variants and modifications can be made according to the teachings of the invention, and all such modifications fall within the scope of protection of the invention, which is defined by the appended claims and their equivalents.

Claims (10)

1. A multi-point interaction control system based on the Unity 3D engine and VR headsets, comprising:
a first data acquisition module for collecting the position of the first client's corresponding first VR handles in their space;
a second data acquisition module for collecting the position of the second client's corresponding second VR handles in their space;
a first data transmission module for transmitting the position of the first client's corresponding first VR handles in their space to a server;
a second data transmission module for transmitting the position of the second client's corresponding second VR handles in their space to the server;
a data processing module for aggregating and analysing the data, identifying the number of control points, and obtaining the movement data of the target object;
an interaction control module for returning the aggregated data to the first client and the second client, and feeding the movement data of the target object back into the first client's corresponding first VR headset and the second client's corresponding second VR headset.
2. The system according to claim 1, wherein aggregating and analysing the data comprises merging the positions of the first VR handles and the second VR handles in their respective spaces into the same space.
3. The system according to claim 2, wherein the interaction control module feeds the spatial positions of the second VR handles and the first VR handles, merged into the same space, back into the first VR headset.
4. The system according to claim 3, wherein the interaction control module feeds the spatial positions of the first VR handles and the second VR handles, merged into the same space, back into the second VR headset.
5. The system according to claim 1, wherein the first data transmission module and the second data transmission module transmit to the server in real time over a wired or wireless connection.
6. The system according to claim 1, wherein the first data transmission module transmits the position of the first VR handles in their space to the first VR headset.
7. The system according to claim 1, wherein the second data transmission module transmits the position of the second VR handles in their space to the second VR headset.
8. A multi-point interaction control method based on the Unity 3D engine and VR headsets, comprising:
S110: collecting the position of each client's corresponding VR handles in their space;
S120: transmitting the position of each client's corresponding VR handles in their space to a server;
S130: aggregating and analysing the data, identifying the number of control points, and obtaining the movement data of the target object;
S140: returning the aggregated data to each client, and feeding the movement data of the target object back into each client's corresponding VR headset.
9. The method according to claim 8, wherein aggregating and analysing the data comprises merging the positions of each client's corresponding VR handles in their spaces into the same space.
10. The method according to claim 8, wherein the position of each client's corresponding VR handles in their space is transmitted in real time into that client's corresponding VR headset.
CN201710219286.6A 2017-04-06 2017-04-06 Multi-point interaction control system and method based on the Unity 3D engine and VR headsets Pending CN107145223A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710219286.6A CN107145223A (en) 2017-04-06 2017-04-06 Multi-point interaction control system and method based on the Unity 3D engine and VR headsets

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710219286.6A CN107145223A (en) 2017-04-06 2017-04-06 Multi-point interaction control system and method based on the Unity 3D engine and VR headsets

Publications (1)

Publication Number Publication Date
CN107145223A true CN107145223A (en) 2017-09-08

Family

ID=59775220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710219286.6A Pending CN107145223A (en) 2017-04-06 2017-04-06 Multi-point interaction control system and method based on the Unity 3D engine and VR headsets

Country Status (1)

Country Link
CN (1) CN107145223A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107890670A * 2017-11-27 2018-04-10 浙江卓锐科技股份有限公司 Scenic-area VR interaction system based on the unity engine
CN108303677A * 2018-02-08 2018-07-20 高强 System and method for space expansion based on Vive Lighthouse
CN108333579A * 2018-02-08 2018-07-27 高强 System and method for dense deployment of light-sensing devices based on Vive Lighthouse
CN109085925A * 2018-08-21 2018-12-25 福建天晴在线互动科技有限公司 Method and storage medium for realizing MR mixed-reality interaction
CN109529318A * 2018-11-07 2019-03-29 艾葵斯(北京)科技有限公司 Virtual vision system
CN109782907A * 2018-12-28 2019-05-21 西安交通大学 Virtual filling collaborative training system based on multiple hybrid-reality devices
CN109921818A * 2019-03-08 2019-06-21 长春理工大学 Intelligent multi-point real-time three-dimensional interactive helmet-mounted display system
TWI688900B (en) * 2018-01-08 2020-03-21 宏達國際電子股份有限公司 Reality system and control method suitable for head-mounted devices located in physical environment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955463A * 2016-04-26 2016-09-21 王立峰 VR (Virtual Reality) virtual experience system based on BIM (Building Information Modeling)
CN106445176A * 2016-12-06 2017-02-22 腾讯科技(深圳)有限公司 Man-machine interaction system and interaction method based on virtual reality technology
CN106534125A * 2016-11-11 2017-03-22 厦门汇鑫元软件有限公司 Method for realizing a VR multi-person interaction system based on a local area network

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955463A * 2016-04-26 2016-09-21 王立峰 VR (Virtual Reality) virtual experience system based on BIM (Building Information Modeling)
CN106534125A * 2016-11-11 2017-03-22 厦门汇鑫元软件有限公司 Method for realizing a VR multi-person interaction system based on a local area network
CN106445176A * 2016-12-06 2017-02-22 腾讯科技(深圳)有限公司 Man-machine interaction system and interaction method based on virtual reality technology

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107890670A * 2017-11-27 2018-04-10 浙江卓锐科技股份有限公司 Scenic-area VR interaction system based on the unity engine
TWI688900B * 2018-01-08 2020-03-21 宏達國際電子股份有限公司 Reality system and control method suitable for head-mounted devices located in physical environment
US10600205B2 2018-01-08 2020-03-24 Htc Corporation Anchor recognition in reality system
US11120573B2 2018-01-08 2021-09-14 Htc Corporation Anchor recognition in reality system
CN108303677A * 2018-02-08 2018-07-20 高强 System and method for space expansion based on Vive Lighthouse
CN108333579A * 2018-02-08 2018-07-27 高强 System and method for dense deployment of light-sensing devices based on Vive Lighthouse
CN109085925A * 2018-08-21 2018-12-25 福建天晴在线互动科技有限公司 Method and storage medium for realizing MR mixed-reality interaction
CN109085925B * 2018-08-21 2021-07-27 福建天晴在线互动科技有限公司 Method and storage medium for realizing MR mixed-reality interaction
CN109529318A * 2018-11-07 2019-03-29 艾葵斯(北京)科技有限公司 Virtual vision system
CN109782907A * 2018-12-28 2019-05-21 西安交通大学 Virtual filling collaborative training system based on multiple hybrid-reality devices
CN109921818A * 2019-03-08 2019-06-21 长春理工大学 Intelligent multi-point real-time three-dimensional interactive helmet-mounted display system
CN109921818B (en) * 2019-03-08 2021-07-20 长春理工大学 Intelligent multi-point real-time three-dimensional interactive helmet display system

Similar Documents

Publication Publication Date Title
CN107145223A (en) Multi-point interaction control system and method based on the Unity 3D engine and VR headsets
US10109113B2 (en) Pattern and method of virtual reality system based on mobile devices
CN107820593B (en) Virtual reality interaction method, device and system
CN106652590B (en) Teaching method, teaching identifier and tutoring system
CN107145222B (en) Automatic tool binding system and method based on the Unity 3D engine and VR equipment
CN105279795B (en) Augmented reality system based on 3D marker
CN107656505A (en) Method, device and system for controlling human-machine collaboration using augmented-reality equipment
CN109475774A (en) Spectator management at view locations in a virtual reality environment
US11049324B2 (en) Method of displaying virtual content based on markers
CN107548470A (en) Pinch-and-hold gesture navigation on a head-mounted display
CN102508363A (en) Wireless display glasses based on augmented-reality technology and implementation method for wireless display glasses
CN109696961A (en) System, method and medium for realizing guided viewing of cultural-relic machinery based on VR technology
CN110083235A (en) Interactive system and data processing method
CN110335359A (en) Simulation method for distribution-board fire accident emergency drills based on virtual reality technology
CN109961520A (en) VR/MR classroom based on third-person-view technology and construction method thereof
CN111694426A (en) VR virtual picking interactive experience system, method, electronic equipment and storage medium
CN112121406A (en) Object control method and device, storage medium and electronic device
Rajappa et al. Application and scope analysis of Augmented Reality in marketing using image processing technique
CN117333644A (en) Virtual reality display picture generation method, device, equipment and medium
CN110741327B (en) Mud toy system and method based on augmented reality and digital image processing
CN202142008U (en) Interactive device
EP2830728B1 (en) Entertainment system and method of providing entertainment
CN109901717A (en) Multi-player virtual-reality racing interaction system
CN110109550A (en) VR immersive extraterrestrial-planet exploration demonstration system
CN110013669A (en) Multi-player virtual-reality racing interaction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170908