WO2019109492A1 - Virtual Reality Scene Simulation Device - Google Patents

Virtual Reality Scene Simulation Device

Info

Publication number
WO2019109492A1
WO2019109492A1 · PCT/CN2018/074245 · CN2018074245W
Authority
WO
WIPO (PCT)
Prior art keywords
platform
virtual reality
processing device
virtual
spherical
Prior art date
Application number
PCT/CN2018/074245
Other languages
English (en)
French (fr)
Inventor
张贯京
葛新科
王海荣
张红治
周亮
Original Assignee
深圳市易特科信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市易特科信息技术有限公司
Publication of WO2019109492A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the utility model relates to the technical field of virtual reality (VR), in particular to a virtual reality scene simulation device.
  • Virtual reality is a new generation of information interaction technology. In recent years it has been continuously developed and improved, has been widely applied across many fields and industries, and has greatly enhanced people's perceptual experience.
  • The virtual space created by combining technologies such as simulation, smart sensors, and graphic display gives users the experience of being in a real scene.
  • Existing VR technology relies mainly on vision, hearing, and the semi-interactive experience of head-motion or gesture capture, while tactile capture is still immature, which greatly diminishes the user's interactive experience.
  • the main purpose of the present invention is to provide a virtual reality scene simulation device, which aims to solve the technical problem that the existing virtual scene interaction device is mainly limited to visual and auditory interaction.
  • the present invention provides a virtual reality scene simulation device, which includes a wearable device, a collection device, and a processing device.
  • the collection device is disposed on a platform, and the processing device is placed outside the platform.
  • the processing device is respectively connected to the wearable device and the collecting device by wire or wirelessly, wherein:
  • the platform includes a spherical prop, a non-spherical prop, and a label layer.
  • the label layer is disposed on the platform, the spherical prop and the non-spherical prop are placed on the label layer, and the non-spherical prop and the spherical prop are each provided with a first label;
  • the label layer comprises a matrix structure composed of a plurality of first labels; each first label serves as a basic unit, and the basic units extend outward from the center point of the platform, which serves as the origin, to form the matrix structure;
  • the wearable device includes VR glasses, a data interaction device, and a second tag, and the VR glasses are provided with a second reader, and the data interaction device is respectively connected to the VR glasses and the processing device by wire or wirelessly;
  • the collecting device includes a hub, a data line and first readers; the first readers are respectively disposed at the corners of the platform and are electrically connected to the processing device through the hub and the data line;
  • the processing device includes a wireless transmitting and receiving device, an input and output unit, a display, a processor, and a power source.
  • the processor is electrically connected to the power source and the input and output unit, respectively, and the input and output unit is also electrically connected to the display and the wireless transmitting and receiving device, respectively.
  • the wireless transmitting and receiving device is paired with the wearable device.
  • the platform is provided with a concave topography and a flat topography, and the spherical props are placed on the label layer at the concave topography.
  • the non-spherical prop is composed of a plurality of sides, and each side of the non-spherical prop is respectively provided with a first label.
  • the data interaction device is further provided with a battery for providing electrical energy to the wearable device.
  • the processing device further comprises a memory electrically connected to the processor, the memory being used to store 3D pictures of the virtual liquid objects, 3D pictures of the virtual objects, ambient sounds, and simulation algorithm schemes.
  • the processing device further comprises a tag reader/writer electrically connected to the processor, the tag reader/writer being used to write different identification information into the first tag and the second tag, respectively.
  • the virtual scene simulation device of the present invention has the beneficial effect of engaging three senses: vision, hearing and touch.
  • on the basis of existing VR, simulation props are added to simulate virtual scenes in various environments.
  • the simulation props in the platform are kept synchronized with the virtual scene picture, which overcomes the current situation in which existing virtual scene interaction is mainly limited to vision and hearing, and improves the user's sense of realism.
  • FIG. 1 is a schematic diagram of application of a virtual reality scene simulation device of the present invention;
  • FIG. 2 is a schematic diagram of a virtual scene screen;
  • FIG. 3 is a schematic structural view of the platform;
  • FIG. 4 is a schematic structural view of the label layer of FIG. 3;
  • FIG. 5 is a schematic structural view of the wearable device of FIG. 1;
  • FIG. 6 is a schematic structural view of the collecting device of FIG. 1;
  • FIG. 7 is a schematic structural view of the processing apparatus of FIG. 1.
  • FIG. 1 is a schematic diagram of application of a virtual reality scene simulation device according to the present invention
  • FIG. 2 is a schematic diagram of a virtual scene screen.
  • the virtual reality scene simulation device includes a wearable device 2, a collection device 3, and a processing device 4; the wearable device 2 is worn on the user 5, the user 5 can move within the platform 1, the collection device 3 is disposed on the platform 1, and the processing device 4 is placed outside the platform 1; the processing device 4 is connected to the wearable device 2 and the collection device 3 respectively by wire or wirelessly.
  • the platform 1 is configured to simulate a virtual scene screen 6 (as shown in FIG. 2 ), and the collecting device 3 is configured to acquire identification information in the platform 1 and transmit the identification information to the processing device 4 by wire or wirelessly. After processing by the processing device 4, the virtual scene screen 6 is obtained, and the virtual scene screen 6 is transmitted back to the wearable device 2 by wire or wirelessly, and the virtual scene screen 6 is displayed to the user 5.
  • FIG. 3 is a schematic structural view of the platform
  • Figure 4 is a schematic structural view of the label layer of Figure 3.
  • the platform 1 includes a spherical prop 11, a non-spherical prop 12, and a label layer 13.
  • the platform 1 is provided with a concave terrain 14 and a flat terrain 15.
  • the label layer 13 is disposed on the platform 1, and the spherical prop 11 is placed on the label layer 13 at the concave terrain 14.
  • the non-spherical props 12 are placed on the label layer 13 at the flat topography 15; the surface of the spherical props 11 is provided with a first label 16 (for example, a passive RFID tag in the prior art).
  • the non-spherical props 12 may be composed of a plurality of sides, and each side of the non-spherical props 12 is respectively provided with a first label 16; the label layer 13 includes a matrix structure composed of a plurality of first labels 16 (as shown in FIG. 4); each of the first labels 16 serves as a basic unit 17, the area of each basic unit 17 is determined by the range that the collecting device 3 can read, and the basic units 17 extend outward from the center point of the platform 1, which serves as the origin, to form a matrix structure.
  • the spherical props 11 are used to simulate virtual liquid objects 61 (such as water and swamps, etc.) in the virtual scene screen 6, and the non-spherical props 12 are used to simulate virtual objects 62 (such as mountains, trees, and houses) in the virtual scene screen 6.
  • different identification information is preset in each first label 16; the collection device 3 acquires the identification information corresponding to the first labels 16 and transmits it to the processing device 4, and the processing device 4 processes the acquired identification information to obtain the 3D pictures of the virtual liquid objects 61 and the virtual objects 62 in the virtual scene picture 6; the label layer 13 is used to locate the coordinates of each point in the platform 1.
  • because the label layer 13 of the present invention adopts a matrix structure, it provides reference points for precisely positioning the coordinates of each point in the platform 1 and forms a coordinate system with the center of the platform 1 as the origin; when the user 5 enters the platform 1, the coordinate position of the user 5 is determined according to the generated coordinate system.
  • FIG. 5 is a schematic structural view of the wearable device of FIG. 1.
  • the wearable device 2 includes VR glasses 21 (for example, an all-in-one VR headset in the prior art), a data interaction device 22, and a second tag 23 (for example, a passive RFID tag in the prior art); the VR glasses 21 are worn on the head of the user 5, the data interaction device 22 is worn on the back of the user 5, and the second tag 23 is worn on the foot of the user 5 (as shown in FIG. 5).
  • the data interaction device 22 is further provided with a battery 221, the VR glasses 21 are provided with a second reader 24 (for example, an RFID tag reader with a linearly polarized antenna in the prior art), and the data interaction device 22 is connected to the VR glasses 21 and the processing device 4 respectively by wire or wirelessly.
  • the battery 221 supplies power to each component of the wearable device, the VR glasses 21 display the virtual scene screen 6, and the second label 23 worn by the user 5 is used to locate the position of the user 5.
  • the second reader 24 disposed on the VR glasses 21 acquires the identification information and power intensity information of the first labels 16 in front of the user 5; the acquired identification information and power intensity information are transmitted via the data interaction device 22 to the processing device 4 for processing, and the processing device 4 integrates them to obtain a viewing-angle screen 63 (the partial screen of the virtual scene screen 6 that the user 5 can observe).
  • the viewing-angle screen 63 is transmitted back by wire or wirelessly, and the data interaction device 22 sends the returned viewing-angle screen 63 to the VR glasses 21 for display.
  • FIG. 6 is a schematic structural diagram of the collecting device of FIG. 1.
  • the collecting device 3 includes a hub 33, a data line 32, and first readers 31 (for example, RFID tag readers with circularly polarized antennas in the prior art); the first readers 31 are respectively disposed at the corners of the platform 1 and are electrically coupled to the processing device 4 via the hub 33 and the data line 32.
  • the first readers 31 are configured to acquire the identification information corresponding to the first tags 16; the first readers 31 also have a receive/transmit power recording function for recording the power intensity of the signals received when the first readers 31 identify the first tags 16 and the second tag 23; the identification information and power intensity information acquired by the first readers 31 are aggregated by the hub 33 and transmitted to the processing device 4 via the data line 32.
  • FIG. 7 is a schematic structural view of the processing apparatus of FIG. 1.
  • the processing device 4 includes a wireless transmitting and receiving device 41, an input/output unit 42, a display 43, a processor 44, a memory 45, a power source 46, and a tag reader/writer 47; the processor 44 is electrically connected to the memory 45, the power source 46, and the input/output unit 42, respectively; the input/output unit 42 is also electrically connected to the tag reader/writer 47, the display 43, and the wireless transmitting and receiving device 41, respectively; and the wireless transmitting and receiving device 41 is paired with the wearable device 2.
  • the tag reader/writer 47 is used to write different identification information into the first tag 16 and the second tag 23, respectively; the display 43 displays the simulation status of the virtual scene screen 6; the memory 45 stores the 3D pictures of the virtual liquid objects 61, the 3D pictures of the virtual objects 62, ambient sounds, and simulation algorithm schemes; the processor 44 extracts the 3D pictures of the virtual liquid objects 61 and virtual objects 62 corresponding to the identification information, applies the positioning simulation algorithm scheme to the power intensity information of the first tags 16 acquired by the first readers to generate a coordinate system, and imports the 3D pictures of the virtual liquid objects 61 and virtual objects 62 into the coordinate system to form the virtual scene screen 6.
  • the wireless transmitting and receiving device 41 receives or transmits data signals, the power source 46 supplies power to each component, and the input/output unit 42 is used for information input and output; the virtual liquid objects 61, virtual objects 62, ambient sounds, and simulation algorithm schemes pre-stored in the memory 45 can be updated or upgraded through the input/output unit 42.
  • the embodiment of the present invention provides a virtual reality scene simulation device: on the basis of existing VR technology, physical scene elements are added to simulate virtual scenes in various environments, and the simulation props in the platform are synchronized with the virtual scene picture; while the user moves within the platform and views the virtual scene picture, the user also physically touches the simulation props in the platform, giving the user an immersive feeling.
  • the utility model achieves synchronization between the simulation device and the virtual scene in the visual, auditory, and tactile dimensions, thereby improving the user's sense of realism in the virtual scene.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The utility model provides a virtual reality scene simulation device, which includes a wearable device, a collection device, and a processing device. The collection device is disposed on a platform, the processing device is placed outside the platform, and the processing device is connected to the wearable device and the collection device respectively by wire or wirelessly. Simulation props provided in the platform are used to simulate various virtual scenes, and the collection device is used to acquire information within the scene, which is processed by the processing device to generate a virtual scene picture. The virtual reality scene simulation device of the utility model keeps the simulation props in the platform synchronized with the virtual scene picture, so that while the user moves within the platform and views the virtual scene picture, the user also physically touches the simulation props in the platform, giving the user an immersive feeling.

Description

Virtual Reality Scene Simulation Device
Technical Field
The utility model relates to the technical field of virtual reality (VR), and in particular to a virtual reality scene simulation device.
Background Art
Virtual reality is a new generation of information interaction technology. In recent years it has been continuously developed and improved, has quickly found wide application in many fields and industries, and has greatly enhanced people's perceptual experience. In VR technology, a virtual space created by combining simulation, smart sensors, graphic display, and other technologies gives the user the experience of being in a real scene. Existing VR technology relies mainly on vision, hearing, and the semi-interactive experience of head-motion or gesture capture, while tactile capture is still immature, which greatly limits the user's interactive experience.
Technical Problem
The main purpose of the utility model is to provide a virtual reality scene simulation device, aiming to solve the technical problem that existing virtual scene interaction devices are mainly limited to visual and auditory interaction.
Technical Solution
To achieve the above purpose, the utility model provides a virtual reality scene simulation device. The virtual reality scene simulation device includes a wearable device, a collection device, and a processing device; the collection device is disposed on a platform, the processing device is placed outside the platform, and the processing device is connected to the wearable device and the collection device respectively by wire or wirelessly, wherein:
The platform includes spherical props, non-spherical props, and a tag layer. The tag layer is disposed on the platform, the spherical props and non-spherical props are placed on the tag layer, and the non-spherical props and spherical props are each provided with a first tag. The tag layer comprises a matrix structure composed of a plurality of first tags; each first tag serves as one basic unit, and the basic units extend outward from the center point of the platform, which serves as the origin, to form the matrix structure;
The wearable device includes VR glasses, a data interaction device, and a second tag; the VR glasses are provided with a second reader, and the data interaction device is connected to the VR glasses and the processing device respectively by wire or wirelessly;
The collection device includes a hub, data lines, and first readers; the first readers are respectively disposed at the corners of the platform and are electrically connected to the processing device through the hub and the data lines;
The processing device includes a wireless transmitting and receiving device, an input/output unit, a display, a processor, and a power source; the processor is electrically connected to the power source and the input/output unit respectively, the input/output unit is also electrically connected to the display and the wireless transmitting and receiving device respectively, and the wireless transmitting and receiving device is paired with the wearable device.
Preferably, the platform is provided with recessed terrain and flat terrain, and the spherical props are placed on the tag layer at the recessed terrain.
Preferably, each non-spherical prop is composed of a plurality of side faces, and each side face of the non-spherical prop is provided with one first tag.
Preferably, the data interaction device is further provided with a battery, and the battery is used to supply electrical energy to the wearable device.
Preferably, the processing device further includes a memory electrically connected to the processor, and the memory is used to store 3D pictures of virtual liquid objects, 3D pictures of virtual objects, ambient sounds, and simulation algorithm schemes.
Preferably, the processing device further includes a tag reader/writer electrically connected to the processor, and the tag reader/writer is used to write different identification information into the first tags and the second tag, respectively.
Beneficial Effects
Compared with the prior art, the virtual scene simulation device of the utility model has the following beneficial effects: it engages vision, hearing, and touch; on the basis of existing VR, simulation props are added to simulate virtual scenes in various environments, and the simulation props in the platform are kept synchronized with the virtual scene picture, thereby overcoming the current situation in which existing virtual scene interaction is mainly limited to vision and hearing and improving the user's sense of realism.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an application of the virtual reality scene simulation device of the utility model;
FIG. 2 is a schematic diagram of a virtual scene picture;
FIG. 3 is a schematic structural diagram of the platform;
FIG. 4 is a schematic structural diagram of the tag layer in FIG. 3;
FIG. 5 is a schematic structural diagram of the wearable device in FIG. 1;
FIG. 6 is a schematic structural diagram of the collection device in FIG. 1;
FIG. 7 is a schematic structural diagram of the processing device in FIG. 1.
The realization of the purpose, the functional features, and the advantages of the utility model will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Best Mode for Carrying Out the Invention
It should be understood that the specific embodiments described here are only intended to explain the utility model and are not intended to limit it.
Referring to FIG. 1 and FIG. 2, FIG. 1 is a schematic diagram of an application of the virtual reality scene simulation device of the utility model, and FIG. 2 is a schematic diagram of a virtual scene picture. The virtual reality scene simulation device includes a wearable device 2, a collection device 3, and a processing device 4. The wearable device 2 is worn on a user 5, the user 5 can move within a platform 1, the collection device 3 is disposed on the platform 1, and the processing device 4 is placed outside the platform 1. The processing device 4 is connected to the wearable device 2 and the collection device 3 respectively by wire or wirelessly.
The platform 1 is used to simulate a virtual scene picture 6 (as shown in FIG. 2). The collection device 3 is used to acquire identification information within the platform 1 and transmit it to the processing device 4 by wire or wirelessly. After processing by the processing device 4, the virtual scene picture 6 is obtained and transmitted back to the wearable device 2 by wire or wirelessly, and the virtual scene picture 6 is displayed to the user 5.
Referring to FIG. 3 and FIG. 4, FIG. 3 is a schematic structural diagram of the platform, and FIG. 4 is a schematic structural diagram of the tag layer in FIG. 3. The platform 1 includes spherical props 11, non-spherical props 12, and a tag layer 13. The platform 1 is provided with recessed terrain 14 and flat terrain 15; the tag layer 13 is disposed on the platform 1, the spherical props 11 are placed on the tag layer 13 at the recessed terrain 14, and the non-spherical props 12 are placed on the tag layer 13 at the flat terrain 15. The surface of each spherical prop 11 is provided with a first tag 16 (for example, a passive RFID tag known in the prior art). Each non-spherical prop 12 may be composed of a plurality of side faces, and each side face of the non-spherical prop 12 is provided with one first tag 16. The tag layer 13 comprises a matrix structure composed of a plurality of first tags 16 (as shown in FIG. 4); each first tag 16 serves as one basic unit 17, the area of each basic unit 17 is determined by the range the collection device 3 can read, and the basic units 17 extend outward from the center point of the platform 1, which serves as the origin, to form the matrix structure.
The spherical props 11 are used to simulate virtual liquid objects 61 (such as water and swamps) in the virtual scene picture 6, and the non-spherical props 12 are used to simulate virtual objects 62 (such as mountains, trees, and houses) in the virtual scene picture 6. Different identification information is preset in each first tag 16. The collection device 3 acquires the identification information corresponding to the first tags 16 and transmits it to the processing device 4, and the processing device 4 processes the acquired identification information to obtain the 3D pictures of the virtual liquid objects 61 and virtual objects 62 in the virtual scene picture 6. The tag layer 13 is used to locate the coordinates of every point in the platform 1; because the tag layer 13 of the utility model adopts a matrix structure, it provides reference points for precisely locating the coordinates of every point in the platform 1 and forms a coordinate system whose origin is the center of the platform 1. When the user 5 enters the platform 1, the coordinate position of the user 5 is determined according to the generated coordinate system.
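The patent does not give an explicit formula for this coordinate system, but the mapping from the matrix of basic units 17 to platform coordinates can be illustrated with a short sketch. Everything below (grid size, unit spacing, tag-identifier format, and function names) is an illustrative assumption, not part of the disclosed device:

```python
# Hedged sketch: mapping the tag-layer matrix (basic units 17) to
# coordinates with the center of platform 1 as the origin.
# The 8 x 8 grid, 0.5 m unit spacing, and tag-id format are assumed.

def unit_to_coordinates(row: int, col: int, rows: int, cols: int,
                        unit_size: float = 0.5) -> tuple:
    """Return the (x, y) center of basic unit (row, col) in metres,
    measured from the platform center (the coordinate-system origin)."""
    x = (col - (cols - 1) / 2) * unit_size
    y = (row - (rows - 1) / 2) * unit_size
    return (x, y)

# Example: an 8 x 8 tag layer; a table like this could be built once the
# tag reader/writer 47 has written an identifier into each first tag 16.
tag_positions = {f"TAG-{r}-{c}": unit_to_coordinates(r, c, 8, 8)
                 for r in range(8) for c in range(8)}

print(tag_positions["TAG-0-0"])  # a corner unit -> (-1.75, -1.75)
print(tag_positions["TAG-3-4"])  # a unit near the center -> (0.25, -0.25)
```

With such a lookup table, any detected first tag 16 (or the basic unit nearest to the second tag 23) can be resolved to a coordinate in the platform coordinate system.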
Referring to FIG. 5, FIG. 5 is a schematic structural diagram of the wearable device in FIG. 1. The wearable device 2 includes VR glasses 21 (for example, an all-in-one VR headset known in the prior art), a data interaction device 22, and a second tag 23 (for example, a passive RFID tag known in the prior art). The VR glasses 21 are worn on the head of the user 5, the data interaction device 22 is worn on the back of the user 5, and the second tag 23 is worn on the foot of the user 5 (as shown in FIG. 5). The data interaction device 22 is further provided with a battery 221, the VR glasses 21 are provided with a second reader 24 (for example, an RFID tag reader with a linearly polarized antenna known in the prior art), and the data interaction device 22 is connected to the VR glasses 21 and the processing device 4 respectively by wire or wirelessly.
The battery 221 supplies power to each component of the wearable device, the VR glasses 21 display the virtual scene picture 6, and the second tag 23 worn by the user 5 is used to locate the position of the user 5. The second reader 24 on the VR glasses 21 acquires the identification information and signal power intensity of the first tags 16 in front of the user 5. The acquired identification information and power intensity information are transmitted via the data interaction device 22 to the processing device 4 for processing; the processing device 4 integrates this information to obtain a view-angle picture 63 (the portion of the virtual scene picture 6 that the user 5 can observe), the view-angle picture 63 is transmitted back to the data interaction device 22 by wire or wirelessly, and the data interaction device 22 sends the returned view-angle picture 63 to the VR glasses 21 for display.
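As an illustration of how the view-angle picture 63 could be selected once the user's position and the tags in front of the user are known, the following sketch filters scene objects by a horizontal field of view. The heading heuristic, the 110-degree field of view, and all function names are assumptions made for the example; the patent only states that the second reader 24 reads the first tags 16 in front of the user and that the processing device 4 integrates the result:

```python
import math

def estimate_heading(user_xy, front_tag_xy):
    """Assumed heuristic: take the strongest first tag 16 seen by the
    head-mounted second reader 24 as lying straight ahead of the user."""
    return math.atan2(front_tag_xy[1] - user_xy[1],
                      front_tag_xy[0] - user_xy[0])

def objects_in_view(user_xy, heading, scene_objects, fov_deg=110.0):
    """Return the scene objects that fall inside an assumed horizontal
    field of view centered on the user's heading."""
    half_fov = math.radians(fov_deg) / 2
    visible = []
    for name, (x, y) in scene_objects.items():
        angle = math.atan2(y - user_xy[1], x - user_xy[0])
        diff = (angle - heading + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= half_fov:
            visible.append(name)
    return visible

# Made-up positions in the platform coordinate system.
scene = {"mountain": (2.0, 1.0), "house": (-1.5, 0.5), "swamp": (1.0, -0.5)}
heading = estimate_heading((0.0, 0.0), (1.75, 0.25))
print(objects_in_view((0.0, 0.0), heading, scene))  # ['mountain', 'swamp']
```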
Referring to FIG. 6, FIG. 6 is a schematic structural diagram of the collection device in FIG. 1. The collection device 3 includes a hub 33, data lines 32, and first readers 31 (for example, RFID tag readers with circularly polarized antennas known in the prior art). The first readers 31 are respectively disposed at the corners of the platform 1 and are electrically connected to the processing device 4 through the hub 33 and the data lines 32.
The first readers 31 are used to acquire the identification information corresponding to the first tags 16. The first readers 31 also have a transmit/receive power recording function, used to record the power intensity of the signals received when the first readers 31 identify the first tags 16 and the second tag 23. The identification information and power intensity information acquired by the first readers 31 are aggregated by the hub 33 and transmitted to the processing device 4 via the data lines 32.
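The patent leaves the positioning step to a "positioning simulation algorithm scheme" without detailing it. One common way to turn per-reader power readings into a coordinate estimate is a weighted centroid over the known corner positions of the first readers 31; the sketch below uses that approach purely as an illustration, and the reader layout, dBm values, and function names are assumptions:

```python
# Hedged illustration: estimating a tag's position (e.g. the second tag 23
# worn by the user) from the received power each corner reader records.

def weighted_centroid(readers, rssi_dbm):
    """readers: {reader_id: (x, y)} corner positions;
    rssi_dbm: {reader_id: received power in dBm for one tag}.
    Stronger (less negative) readings pull the estimate toward that reader."""
    weights = {rid: 10 ** (rssi_dbm[rid] / 10.0) for rid in readers}
    total = sum(weights.values())
    x = sum(weights[rid] * readers[rid][0] for rid in readers) / total
    y = sum(weights[rid] * readers[rid][1] for rid in readers) / total
    return (x, y)

# Assumed layout: four first readers 31 at the corners of a 4 m x 4 m
# platform, with the platform center as the origin.
corner_readers = {
    "R1": (-2.0, -2.0), "R2": (2.0, -2.0),
    "R3": (-2.0, 2.0),  "R4": (2.0, 2.0),
}
# Example readings for the user's second tag 23 (values are made up).
readings = {"R1": -48.0, "R2": -55.0, "R3": -60.0, "R4": -63.0}
print(weighted_centroid(corner_readers, readings))  # roughly (-1.3, -1.7)
```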
Referring to FIG. 7, FIG. 7 is a schematic structural diagram of the processing device in FIG. 1. The processing device 4 includes a wireless transmitting and receiving device 41, an input/output unit 42, a display 43, a processor 44, a memory 45, a power source 46, and a tag reader/writer 47. The processor 44 is electrically connected to the memory 45, the power source 46, and the input/output unit 42, respectively; the input/output unit 42 is also electrically connected to the tag reader/writer 47, the display 43, and the wireless transmitting and receiving device 41, respectively; and the wireless transmitting and receiving device 41 is paired with the wearable device 2.
The tag reader/writer 47 is used to write different identification information into the first tags 16 and the second tag 23, respectively. The display 43 is used to display the simulation status of the virtual scene picture 6. The memory 45 stores the 3D pictures of the virtual liquid objects 61, the 3D pictures of the virtual objects 62, ambient sounds, and simulation algorithm schemes. The processor 44 is used to extract the 3D pictures of the virtual liquid objects 61 and virtual objects 62 corresponding to the identification information, to apply the positioning simulation algorithm scheme to the power intensity information of the first tags 16 acquired by the first readers to generate a coordinate system, and to import the 3D pictures of the virtual liquid objects 61 and virtual objects 62 into the coordinate system through the simulation algorithm scheme to form the virtual scene picture 6. The wireless transmitting and receiving device 41 receives or transmits data signals, the power source 46 supplies power to each component, and the input/output unit 42 is used for information input and output. The virtual liquid objects 61, virtual objects 62, ambient sounds, and simulation algorithm schemes pre-stored in the memory 45 can be updated or upgraded through the input/output unit 42.
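To make the data flow of this paragraph concrete, the sketch below maps collected tag identifiers to stored 3D assets and places each asset at its tag's coordinate to assemble a scene description. The asset table, the SceneObject structure, and the function names are hypothetical; the patent only specifies that the processor 44 extracts the stored 3D pictures matching the identification information and imports them into the coordinate system:

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    asset: str        # name of a 3D picture held in the memory 45 (assumed naming)
    position: tuple   # (x, y) in the platform coordinate system

# Assumed lookup table: identification information written into a first
# tag 16 -> the stored 3D picture it stands for.
ASSET_BY_TAG_ID = {
    "ID-WATER-01": "virtual_liquid/water",
    "ID-MOUNTAIN-01": "virtual_object/mountain",
    "ID-HOUSE-01": "virtual_object/house",
}

def compose_scene(collected_ids, tag_coordinates):
    """collected_ids: identifiers reported by the first readers 31;
    tag_coordinates: identifier -> (x, y) from the tag-layer coordinate system.
    Returns the placed objects that make up the virtual scene picture 6."""
    scene = []
    for tag_id in collected_ids:
        asset = ASSET_BY_TAG_ID.get(tag_id)
        if asset is not None:
            scene.append(SceneObject(asset, tag_coordinates[tag_id]))
    return scene

placed = compose_scene(
    ["ID-WATER-01", "ID-MOUNTAIN-01"],
    {"ID-WATER-01": (-1.0, 0.5), "ID-MOUNTAIN-01": (1.5, 1.5)},
)
for obj in placed:
    print(obj)
```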
The embodiment of the utility model provides a virtual reality scene simulation device. On the basis of existing VR technology, physical scene elements are added to simulate virtual scenes in various environments, and the simulation props in the platform are kept synchronized with the virtual scene picture; while the user moves within the platform and views the virtual scene picture, the user also physically touches the simulation props in the platform, giving the user an immersive feeling. By combining the simulation device with the virtual scene, the utility model achieves synchronization between the simulation device and the virtual scene in the visual, auditory, and tactile dimensions, thereby improving the user's sense of realism in the virtual scene.
The above are only preferred embodiments of the utility model and do not thereby limit the scope of its patent. Any equivalent structural or functional transformation made using the contents of the specification and drawings of the utility model, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of the utility model.
Industrial Applicability
Compared with the prior art, the virtual scene simulation device of the utility model has the following beneficial effects: it engages vision, hearing, and touch; on the basis of existing VR, simulation props are added to simulate virtual scenes in various environments, and the simulation props in the platform are kept synchronized with the virtual scene picture, thereby overcoming the current situation in which existing virtual scene interaction is mainly limited to vision and hearing and improving the user's sense of realism.

Claims (6)

  1. A virtual reality scene simulation device, characterized in that the virtual reality scene simulation device comprises a wearable device, a collection device, and a processing device, the collection device being disposed on a platform, the processing device being placed outside the platform, and the processing device being connected to the wearable device and the collection device respectively by wire or wirelessly, wherein: the platform comprises spherical props, non-spherical props, and a tag layer, the tag layer is disposed on the platform, the spherical props and non-spherical props are placed on the tag layer, and the non-spherical props and spherical props are each provided with a first tag; the tag layer comprises a matrix structure composed of a plurality of first tags, each first tag serves as one basic unit, and the basic units extend outward from the center point of the platform, which serves as the origin, to form the matrix structure; the wearable device comprises VR glasses, a data interaction device, and a second tag, the VR glasses are provided with a second reader, and the data interaction device is connected to the VR glasses and the processing device respectively by wire or wirelessly; the collection device comprises a hub, data lines, and first readers, the first readers are respectively disposed at the corners of the platform and are electrically connected to the processing device through the hub and the data lines; the processing device comprises a wireless transmitting and receiving device, an input/output unit, a display, a processor, and a power source, the processor is electrically connected to the power source and the input/output unit respectively, the input/output unit is also electrically connected to the display and the wireless transmitting and receiving device respectively, and the wireless transmitting and receiving device is paired with the wearable device.
  2. The virtual reality scene simulation device according to claim 1, characterized in that the platform is provided with recessed terrain and flat terrain, and the spherical props are placed on the tag layer at the recessed terrain.
  3. The virtual reality scene simulation device according to claim 1, characterized in that each non-spherical prop is composed of a plurality of side faces, and each side face of the non-spherical prop is provided with one first tag.
  4. The virtual reality scene simulation device according to claim 1, characterized in that the data interaction device is further provided with a battery, and the battery is used to supply electrical energy to the wearable device.
  5. The virtual reality scene simulation device according to claim 1, characterized in that the processing device further comprises a memory electrically connected to the processor, and the memory is used to store 3D pictures of virtual liquid objects, 3D pictures of virtual objects, ambient sounds, and simulation algorithm schemes.
  6. The virtual reality scene simulation device according to claim 1, characterized in that the processing device further comprises a tag reader/writer electrically connected to the processor, and the tag reader/writer is used to write different identification information into the first tags and the second tag, respectively.
PCT/CN2018/074245 2017-12-04 2018-01-26 Virtual reality scene simulation device WO2019109492A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201721660460.2U CN207488953U (zh) 2017-12-04 2017-12-04 Virtual reality scene simulation device
CN201721660460.2 2017-12-04

Publications (1)

Publication Number Publication Date
WO2019109492A1 true WO2019109492A1 (zh) 2019-06-13

Family

ID=62457554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/074245 WO2019109492A1 (zh) 2017-12-04 2018-01-26 Virtual reality scene simulation device

Country Status (2)

Country Link
CN (1) CN207488953U (zh)
WO (1) WO2019109492A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113433837A (zh) * 2021-06-15 2021-09-24 浙江水利水电学院 VR-based interior design method and system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109933207A (zh) * 2019-04-02 2019-06-25 黄立新 Haptic generation simulation method for a virtual reality environment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101435872A (zh) * 2008-12-23 2009-05-20 郑之敏 RFID matrix distributed personnel positioning and monitoring system and monitoring method therefor
CN103097907A (zh) * 2010-08-27 2013-05-08 贝内迪克特·希罗尼米 System for detecting high-frequency transceivers and use thereof
CN104024984A (zh) * 2011-10-05 2014-09-03 弗兰霍菲尔运输应用研究公司 Portable device, virtual reality system and method
CN104704535A (zh) * 2012-10-02 2015-06-10 索尼公司 Augmented reality system
CN105183142A (zh) * 2014-06-13 2015-12-23 中国科学院光电研究院 Digital information reproduction method using spatial position binding
CN105373224A (zh) * 2015-10-22 2016-03-02 山东大学 Mixed reality game system and method based on pervasive computing
US9599821B2 (en) * 2014-08-08 2017-03-21 Greg Van Curen Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space
CN107167132A (zh) * 2016-03-07 2017-09-15 上海积杉信息科技有限公司 Indoor positioning system based on augmented reality and virtual reality


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113433837A (zh) * 2021-06-15 2021-09-24 浙江水利水电学院 VR-based interior design method and system
CN113433837B (zh) * 2021-06-15 2023-03-21 浙江水利水电学院 VR-based interior design method and system

Also Published As

Publication number Publication date
CN207488953U (zh) 2018-06-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18886043

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18886043

Country of ref document: EP

Kind code of ref document: A1