WO2020060569A1 - System and method for importing a software application into a virtual reality setting - Google Patents
System and method for importing a software application into a virtual reality setting
- Publication number
- WO2020060569A1 (PCT/US2018/052305)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual reality
- virtual
- processor
- software application
- reality setting
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
- G09B5/065—Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/445—Program loading or initiating
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2350/00—Solving problems of bandwidth in display systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/02—Networking aspects
- G09G2370/022—Centralised management of display operation, e.g. in a server instead of locally
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/02—Networking aspects
- G09G2370/027—Arrangements and methods specific for the display of internet documents
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/08—Biomedical applications
Definitions
- the subject matter of the present disclosure refers generally to a system and method for importing a software application into a virtual reality setting.
- Virtual reality provides endless opportunities to train professionals in various fields on how to use equipment they may encounter while on the job.
- a virtual reality environment may be made for multiple scenarios that more realistically simulate situations that professionals may find themselves in while operating in their chosen field. It also allows other professionals to view the training professional’s performance as they attempt to complete the training exercise without spatial proximity becoming an issue.
- the same scenario may be used to train multiple professionals multiple times. This may drastically reduce costs depending on the number of training simulations a group of professionals must undergo since each training scenario might otherwise need to be physically created. Waste is also greatly reduced for training exercises that require one-time-use materials. For instance, a medical training institution needing cadavers for each of its students may spend a considerable amount of money, whereas a single virtual reality simulation providing multiple cadavers in various conditions may provide multiple students a greater variety of training at a much lower cost.
- a system and method for importing software into a virtual reality setting is provided.
- the system generally comprises a computing device, control device operably connected to the computing device, computing entity, and display.
- the computing device comprises a processor, expansion port operably connected to the processor, power supply, and a non-transitory computer-readable medium coupled to the processor and having instructions stored thereon.
- the control device may be operably connected to the processor via the expansion port.
- the control device may be operably connected to the processor wirelessly via a communication interface of the computing entity.
- a database may be operably connected to the processor in a way such that it may store data created when the system is operated by a user. It is understood that the various method steps associated with the methods of the present disclosure may be carried out as operations by the system.
- a plurality of functions is coupled to the control device in a way that may cause a corresponding paired application and/or physical machine to perform an action.
- a user may activate the plurality of functions of the control device by manipulating the control device in a way such that it sends an input to the system.
- the control device may transmit the input to the processor via the expansion port.
- the processor may then transform the function into a digital input and/or a virtual reality input.
- the plurality of functions of the control device may be in the form of a driver.
- the driver facilitates communication between the system and the control device, wherein the processor carries out the processes of the system and the plurality of functions of the driver.
- the processor may pair the input with a certain function of the driver. Once the processor pairs input from the control device with a function of the driver, the processor may relay the function to the paired application, which is a software application that has been created to work with a physical machine and/or control device.
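- The pairing described above can be sketched in a few lines of Python. The following is a minimal, hypothetical illustration only; the DRIVER_FUNCTIONS table, the PairedApplication class, and the input codes are assumptions for illustration and are not part of the disclosure.

```python
# Minimal sketch: pairing a raw control-device input with a driver
# function and relaying that function to the paired application.
# All names here are hypothetical illustrations.

# Hypothetical driver table mapping raw input codes to named functions.
DRIVER_FUNCTIONS = {
    0x01: "elevate_hoist",
    0x02: "lower_hoist",
}

class PairedApplication:
    """Stand-in for a software application created to work with a
    physical machine and/or control device."""
    def perform(self, function_name):
        print(f"Paired application executing: {function_name}")

def relay_input(raw_input_code, app):
    """Pair the control-device input with a driver function, then
    relay the function to the paired application."""
    function_name = DRIVER_FUNCTIONS.get(raw_input_code)
    if function_name is not None:
        app.perform(function_name)

relay_input(0x01, PairedApplication())  # -> "elevate_hoist"
```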
- a user may navigate the virtual reality setting using a display and the control device.
- a virtual reality setting may comprise a virtual reality environment, a virtual screen, and a plurality of virtual reality objects.
- the virtual reality environment defines the boundaries of the virtual reality setting.
- a user may navigate a virtual reality setting by changing their virtual position within the virtual reality setting.
- the virtual reality environment may be populated by the plurality of virtual reality objects.
- the plurality of virtual reality objects may provide the virtual reality setting with characteristics of the physical setting that the virtual reality setting is meant to emulate.
- the plurality of virtual reality objects may be manipulated by a user as the user interacts with the virtual reality setting.
- a user may interact with the virtual reality objects using a control device or some other form of equipment designed to allow a user to interact with the virtual reality setting.
- a virtual screen allows for data not part of the virtual reality setting to be streamed into the virtual reality setting.
- a virtual window may be coupled to a streaming application in a way such that data viewed within the streaming application may be viewed within the virtual screen.
- a virtual reality setting may have multiple virtual screens that may stream data from multiple streaming applications.
- a computing entity operably connected to the system may render the virtual reality setting.
- the computing entity that renders the virtual reality setting preferably comprises a display, secondary processor operably connected to the display, a secondary power supply, and a secondary non-transitory computer-readable medium operably connected to the processor and having instructions stored thereon.
- the secondary processor of the computing entity preferably receives the virtual reality setting from the non-transitory computer-readable medium. Once received, the secondary processor may render the virtual reality setting and transmit the virtual reality setting to the display.
- the secondary processor may receive a stream from a streaming application of the system and transmit that stream to the virtual screen within the virtual reality setting. A viewer may then view both the virtual reality setting and the stream within the virtual screen.
- FIG. 1 is a diagram of an example environment in which techniques described herein may be implemented.
- FIG. 2 is a diagram of an example environment in which techniques described herein may be implemented.
- FIG. 3 is a diagram of an example environment in which techniques described herein may be implemented.
- FIG. 4 is a diagram of an example environment in which techniques described herein may be implemented.
- FIG. 5 is an example virtual reality setting in which software may be integrated.
- FIG. 6 is a diagram illustrating the manner in which individual access to data may be granted or limited based on user or system roles.
- FIG. 7 is a flow chart illustrating certain method steps of a method embodying features consistent with the principles of the present disclosure.
- FIG. 8 is a flow chart illustrating certain method steps of a method embodying features consistent with the principles of the present disclosure.
- FIG. 9 is a flow chart illustrating certain method steps of a method embodying features consistent with the principles of the present disclosure.
- a system 400 “comprising” components A, B, and C can contain only components A, B, and C, or can contain not only components A, B, and C, but also one or more other components.
- the defined steps can be carried out in any order or simultaneously (except where the context excludes that possibility), and the method can include one or more other steps which are carried out before any of the defined steps, between two of the defined steps, or after all the defined steps (except where the context excludes that possibility).
- a spatial position 470 may be defined as a user’s 430 position within the physical world.
- a virtual position 480 may be defined as a user’s 430 position within a virtual world.
- the present invention satisfies the need for a system 400 and method capable of importing a software application into a virtual reality setting 425 while simultaneously allowing a user 430 to use the control device 405 they might use in a real-life situation, and thereby improving upon known systems currently employed within the art.
- FIG. 1 depicts an exemplary environment 100 of the system 400 consisting of clients 105 connected to a server 110 and/or database 115 via a network 150
- Clients 105 are devices of users 430 that may be used to access servers 110 and/or databases 115 through a network 150
- a network 150 may comprise one or more networks of any kind, including, but not limited to, a local area network (LAN), a wide area network (WAN), metropolitan area networks (MAN), a telephone network, such as the Public Switched Telephone Network (PSTN), an intranet, the Internet, a memory device, another type of network, or a combination of networks.
- computing entities 200 may act as clients 105 for a user 430
- a client 105 may include a personal computer, a wireless telephone, a personal digital assistant (PDA), a laptop, a smart phone, a tablet computer, or another type of computation or communication device.
- Servers 110 may include devices that access, fetch, aggregate, process, search, provide, and/or maintain documents.
- Although FIG. 1 depicts a preferred embodiment of an environment 100 for the system 400, in other implementations the environment 100 may contain fewer components, different components, differently arranged components, and/or additional components than those depicted in FIG. 1. Alternatively, or additionally, one or more components of the environment 100 may perform one or more other tasks described as being performed by one or more other components of the environment 100.
- one embodiment of the system 400 may comprise a server 110.
- a server 110 may, in some implementations, be implemented as multiple devices interlinked together via the network 150, wherein the devices may be distributed over a large geographic area and perform different or similar functions.
- two or more servers 110 may be implemented to work as a single server 110 performing the same tasks.
- one server 110 may perform the functions of multiple servers 110.
- a single server 110 may perform the tasks of a web server 110 and an indexing server 110.
- multiple servers 110 may be used to operably connect the processor 220 to the database 115 and/or other content repositories.
- the processor 220 may be operably connected to the server 110 via wired or wireless connection.
- Types of servers 110 that may be used by the system 400 include, but are not limited to, search servers 110, document indexing servers 110, and web servers 110, or any combination thereof.
- Search servers 110 may include one or more computing entities 200 designed to implement a search engine, such as a documents/records search engine, general webpage search engine, etc.
- Search servers 110 may, for example, include one or more web servers 110 designed to receive search queries and/or inputs from users 430, search one or more databases 115 in response to the search queries and/or inputs, and provide documents or information, relevant to the search queries and/or inputs, to users 430.
- search servers 110 may include a web search server 110 that may provide webpages to users 430, wherein a provided webpage may include a reference to a web server 110 at which the desired information and/or links are located.
- Document indexing servers 110 may include one or more devices designed to index documents available through networks 150. Document indexing servers 110 may access other servers 110, such as web servers 110 that host content, to index the content. In some implementations, document indexing servers 110 may index documents/records stored by other servers 110 connected to the network 150. Document indexing servers 110 may, for example, store and index content, information, and documents relating to user accounts and user-generated content. Web servers 110 may include servers 110 that provide webpages to clients 105. For instance, the webpages may be HTML- based webpages. A web server 110 may host one or more websites.
- a website may refer to a collection of related webpages. Frequently, a website may be associated with a single domain name, although some websites may potentially encompass more than one domain name.
- the concepts described herein may be applied on a per-website basis. Alternatively, in some implementations, the concepts described herein may be applied on a per-webpage basis.
- a database 115 refers to a set of related data and the way it is organized.
- a database management system (DBMS) provides various functions that allow entry, storage, and retrieval of large quantities of information and provides ways to manage how that information is organized. Because of the close relationship between the database 115 and the DBMS, as used herein, the term database 115 refers to both a database 115 and its DBMS.
- FIG. 2 is an exemplary diagram of a client, server, and/or database 115 (hereinafter called a “computing entity”), which may correspond to one or more of the clients 105, servers 110, and databases 115 according to an implementation consistent with the principles of the invention as described herein.
- the computing entity 200 may comprise bus 210, a processor 220, memory 304, a storage device 250, a peripheral device 270, and a communication interface 280.
- the bus 210 may be defined as one or more conductors that permit communication among the components of the computing entity 200.
- the processor 220 may be defined as a logic circuitry that responds to and processes the basic instructions that drive the computing entity 200.
- Memory 304 may be defined as the integrated circuitry that stores information for immediate use in a computing entity 200.
- a peripheral device 270 may be defined as any hardware used by a user 430 and/or the computing entity 200 to facilitate communication between the two.
- a storage device 250 may be defined as a device used to provide mass storage to a computing entity 200.
- a communication interface 280 may be defined as any transceiver-like device that enables the computing entity 200 to communicate with other devices and/or computing entities 200.
- the bus 210 may comprise a high-speed interface 308 and/or a low-speed interface 312 that connects the various components together in a way such that they may communicate with one another.
- a high-speed interface 308 manages bandwidth-intensive operations for computing device 300, while a low-speed interface 312 manages lower bandwidth-intensive operations.
- the high-speed interface 308 of a bus 210 may be coupled to the memory 304, display 316, and to high-speed expansion ports 310, which may accept various expansion cards such as a graphics processing unit (GPU).
- the low-speed interface 312 of a bus 210 may be coupled to a storage device 250 and low-speed expansion ports 314.
- the low-speed expansion ports 314 may include various communication ports, such as USB, Bluetooth, Ethernet, wireless Ethernet, etc. Additionally, the low-speed expansion ports 314 may be coupled to one or more peripheral devices 270, such as a keyboard, pointing device, scanner, and/or a networking device, wherein the low-speed expansion ports 314 facilitate the transfer of input data from the peripheral devices 270 to the processor 220 via the low-speed interface 312.
- the processor 220 may comprise any type of conventional processor 220 or microprocessor 220 that interprets and executes computer readable instructions.
- the processor 220 is configured to perform the operations disclosed herein based on instructions stored within the system 400.
- the processor 220 may process instructions for execution within the computing entity 200, including instructions stored in memory 304 or on a storage device 250, to display graphical information for a graphical user interface (GUI) on an external peripheral device 270, such as a display 316.
- the processor 220 may provide for coordination of the other components of a computing entity 200, such as control of user interfaces, applications run by a computing entity 200, and wireless communication by a communication device of the computing entity 200.
- the processor 220 may be any processor 220 or microprocessor 220 suitable for executing instructions.
- the processor 220 may have a memory device therein or coupled thereto suitable for storing the data, content, or other information or material disclosed herein.
- the processor 220 may be a component of a larger computing entity 200.
- a computing entity 200 that may house the processor 220 therein may include, but is not limited to, laptops, desktops, workstations, personal digital assistants, servers 110, mainframes, cellular telephones, tablet computers, or any other similar device. Accordingly, the inventive subject matter disclosed herein, in full or in part, may be implemented or utilized in devices including, but not limited to, laptops, desktops, workstations, personal digital assistants, servers 110, mainframes, cellular telephones, tablet computers, or any other similar device.
- Memory 304 stores information within computing device 300
- memory 304 may include one or more volatile memory units.
- memory 304 may include one or more non-volatile memory units.
- Memory 304 may also include another form of computer-readable medium, such as a magnetic or optical disk. For instance, a portion of a magnetic hard drive may be partitioned as a dynamic scratch space to allow for temporary storage of information that may be used by the processor 220 when faster types of memory, such as random-access memory (RAM), are in high demand.
- a computer-readable medium may refer to a non-transitory computer-readable memory device.
- a memory device may refer to storage space within a single storage device 250 or spread across multiple storage devices 250
- the memory 304 may comprise main memory 230 and/or read only memory (ROM) 240
- the main memory 230 may comprise RAM or another type of dynamic storage device 250 that stores information and instructions for execution by the processor 220
- ROM 240 may comprise a conventional ROM device or another type of static storage device 250 that stores static information and instructions for use by processor 220
- the storage device 250 may comprise a magnetic and/or optical recording medium and its corresponding drive.
- a peripheral device 270 is a device that facilitates communication between a user 430 and the processor 220
- the peripheral device 270 may include, but is not limited to, an input device and/or an output device.
- an input device may be defined as a device that allows a user 430 to input data and instructions that are then converted into a pattern of electrical signals in binary code that are comprehensible to a computing entity 200.
- An input device of the peripheral device 270 may include one or more conventional devices that permit a user 430 to input information into the computing entity 200, such as a scanner, phone, camera, scanning device, keyboard, a mouse, a pen, voice recognition and/or biometric mechanisms, etc.
- an output device may be defined as a device that translates the electronic signals received from a computing entity 200 into a form intelligible to the user 430.
- An output device of the peripheral device 270 may include one or more conventional devices that output information to a user 430, including a display 316, a printer, a speaker, an alarm, a projector, etc.
- storage devices 250, such as CD-ROM drives, and other computing entities 200 may act as peripheral devices 270 that operate independently of the operably connected computing entity 200.
- a fitness tracker may transfer data to a smartphone, wherein the smartphone may use that data in a manner separate from the fitness tracker.
- the storage device 250 is capable of providing the computing entity 200 mass storage.
- the storage device 250 may comprise a computer-readable medium such as the memory 304, storage device 250, or a memory 304 on processor 220.
- a computer-readable medium may be defined as one or more physical or logical memory devices and/or carrier waves. Devices that may act as a computer readable medium include, but are not limited to, a hard disk device, optical disk device, tape device, flash memory or other similar solid-state memory device, or an array of devices, including devices in a storage area network or other configurations.
- Examples of computer-readable mediums include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM discs and DVDs; magneto-optical media such as optical discs; and hardware devices that are specially configured to store and perform programming instructions, such as ROM 240, RAM, flash memory, and the like.
- a computer program may be tangibly embodied in the storage device
- the computer program may contain instructions that, when executed by the processor 220, perform one or more steps that comprise a method, such as those methods described herein.
- the instructions within a computer program may be carried to the processor 220 via the bus 210.
- the computer program may be carried to a computer-readable medium, wherein the information may then be accessed from the computer-readable medium by the processor 220 via the bus 210 as needed.
- the software instructions may be read into memory 304 from another computer-readable medium, such as data storage device 250, or from another device via the communication interface 280.
- hardwired circuitry may be used in place of or in combination with software instructions to implement processes consistent with the principles as described herein. Thus, implementations consistent with the invention as described herein are not limited to any specific combination of hardware circuitry and software.
- FIG. 3 depicts exemplary computing entities 200 in the form of a computing device 300 and mobile computing device 350, which may be used to carry out the various embodiments of the invention as described herein.
- a computing device 300 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers 110, databases 115, mainframes, and other appropriate computers.
- a mobile computing device 350 is intended to represent various forms of mobile devices, such as scanners, scanning devices, personal digital assistants, cellular telephones, smart phones, tablet computers, and other similar devices.
- the various components depicted in FIG. 3, as well as their connections, relationships, and functions are meant to be examples only, and are not meant to limit the implementations of the invention as described herein.
- the computing device 300 may be implemented in a number of different forms, as shown in FIGS. 1 and 3.
- a computing device 300 may be implemented as a server 110 or in a group of servers 110.
- Computing devices 300 may also be implemented as part of a rack server system.
- a computing device 300 may be implemented as a personal computer, such as a desktop computer or laptop computer.
- components from a computing device 300 may be combined with other components in a mobile device, thus creating a mobile computing device 350.
- Each mobile computing device 350 may contain one or more computing devices 300 and mobile devices, and an entire system may be made up of multiple computing devices 300 and mobile devices communicating with each other as depicted by the mobile computing device 350 in FIG. 3.
- the computing entities 200 consistent with the principles of the invention as disclosed herein may perform certain receiving, communicating, generating, output providing, correlating, and storing operations as needed to perform the various methods as described in greater detail below.
- a computing device 300 may include a processor 220, memory 304, a storage device 250, high-speed expansion ports 310, low-speed expansion ports 314, and a bus 210 operably connecting the processor 220, memory 304, storage device 250, high-speed expansion ports 310, and low-speed expansion ports 314.
- the bus 210 may comprise a high-speed interface 308 connecting the processor 220 to the memory 304 and high-speed expansion ports 310 as well as a low-speed interface 312 connecting to the low-speed expansion ports 314 and the storage device 250. Because each of the components is interconnected using the bus 210, they may be mounted on a common motherboard as depicted in FIG. 3.
- the processor 220 may process instructions for execution within the computing device 300, including instructions stored in memory 304 or on the storage device 250. Processing these instructions may cause the computing device 300 to display graphical information for a graphical user interface (GUI) on an output device, such as a display 316 coupled to the high-speed interface 308.
- multiple processors and/or multiple buses may be used, as appropriate, along with multiple memory units and/or multiple types of memory.
- multiple computing devices may be connected, wherein each device provides portions of the necessary operations.
- a mobile computing device 350 may include a processor 220, memory 304, a peripheral device 270 (such as a display 316), a communication interface 280, and a transceiver 368, among other components.
- a mobile computing device 350 may also be provided with a storage device 250, such as a micro-drive or other previously mentioned storage device 250, to provide additional storage.
- each of the components of the mobile computing device 350 is interconnected using a bus 210, which may allow several of the components of the mobile computing device 350 to be mounted on a common motherboard as depicted in FIG. 3 or in other manners as appropriate.
- a computer program may be tangibly embodied in an information carrier.
- the computer program may contain instructions that, when executed by the processor 220, perform one or more methods, such as those described herein.
- the information carrier is preferably a computer- readable medium, such as memory, expansion memory 374, or memory 304 on the processor 220 such as ROM 240, that may be received via the transceiver or external interface 362.
- the mobile computing device 350 may be implemented in a number of different forms, as shown in FIG. 3.
- a mobile computing device 350 may be implemented as a cellular telephone, part of a smart phone, personal digital assistant, or other similar mobile device.
- the processor 220 may execute instructions within the mobile computing device 350, including instructions stored in the memory 304 and/or storage device 250.
- the processor 220 may be implemented as a chipset of chips that may include separate and multiple analog and/or digital processors.
- the processor 220 may provide for coordination of the other components of the mobile computing device 350, such as control of the user interfaces, applications run by the mobile computing device 350, and wireless communication by the mobile computing device 350.
- the processor 220 of the mobile computing device 350 may communicate with a user 430 through the control interface 358 coupled to a peripheral device 270 and the display interface 356 coupled to a display 316.
- the display 316 of the mobile computing device 350 may include, but is not limited to, Liquid Crystal Display (LCD), Light Emitting Diode (LED) display, Organic Light Emitting Diode (OLED) display, and Plasma Display Panel (PDP), or any combination thereof.
- the display interface 356 may include appropriate circuitry for causing the display 316 to present graphical and other information to a user 430.
- the control interface 358 may receive commands from a user 430 via a peripheral device 270 and convert the commands into a computer readable signal for the processor 220.
- an external interface 362 may be provided in communication with processor 220, which may enable near area communication of the mobile computing device 350 with other devices.
- the external interface 362 may provide for wired communications in some implementations or wireless communication in other implementations. In a preferred embodiment, multiple interfaces may be used in a single mobile computing device 350 as is depicted in FIG. 3.
- Memory 304 stores information within the mobile computing device 350.
- Devices that may act as memory 304 for the mobile computing device 350 include, but are not limited to computer-readable media, volatile memory, and non-volatile memory.
- Expansion memory 374 may also be provided and connected to the mobile computing device 350 through an expansion interface 372, which may include a Single In-Line Memory Module (SIM) card interface or micro secure digital (Micro-SD) card interface.
- Expansion memory 374 may include, but is not limited to, various types of flash memory and non-volatile random-access memory (NVRAM). Such expansion memory 374 may provide extra storage space for the mobile computing device 350.
- expansion memory 374 may store computer programs or other information that may be used by the mobile computing device 350.
- expansion memory 374 may have instructions stored thereon that, when carried out by the processor 220, cause the mobile computing device 350 to perform the methods described herein. Further, expansion memory 374 may have secure information stored thereon; therefore, expansion memory 374 may be provided as a security module for a mobile computing device 350, wherein the security module may be programmed with instructions that permit secure use of a mobile computing device 350. In addition, expansion memory 374 having secure applications and secure information stored thereon may allow a user 430 to place identifying information on the expansion memory 374 via the mobile computing device 350 in a non-hackable manner.
- a mobile computing device 350 may communicate wirelessly through the communication interface 280, which may include digital signal processing circuitry where necessary.
- the communication interface 280 may provide for communications under various modes or protocols, including, but not limited to, Global System Mobile Communication (GSM), Short Message Services (SMS), Enterprise Messaging System (EMS), Multimedia Messaging Service (MMS), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Personal Digital Cellular (PDC), Wideband Code Division Multiple Access (WCDMA), IMT Multi-Carrier (CDMA2000), and General Packet Radio Service (GPRS), or any combination thereof.
- Short-range communication may occur, such as using a Bluetooth, WIFI, or other such transceiver 368.
- a Global Positioning System (GPS) receiver module 370 may provide additional navigation-and location- related wireless data to the mobile computing device 350, which may be used as appropriate by applications running on the mobile computing device 350.
- the mobile computing device 350 may communicate audibly using an audio codec 360, which may receive spoken information from a user 430 and convert the received spoken information into a digital form that may be processed by the processor 220.
- the audio codec 360 may likewise generate audible sound for a user 430, such as through a speaker, e.g., in a handset of mobile computing device 350.
- Such sound may include sound from voice telephone calls, recorded sound such as voice messages, music files, etc. Sound may also include sound generated by applications operating on the mobile computing device 350.
- FIGS. 4-9 illustrate embodiments of a system 400 for importing a paired application 415A into a virtual reality setting 425.
- FIG. 4 shows an embodiment of the disclosed system 400.
- the system 400 generally comprises a computing device 300, control device 405 operably connected to the computing device 300, computing entity 200, and display 316.
- the computing device 300 comprises a processor 220, expansion port 310, 314 operably connected to the processor 220, power supply, and a non-transitory computer-readable medium 402 coupled to the processor 220 and having instructions stored thereon.
- the control device 405 may be operably connected to the processor 220 via the expansion port 310, 314.
- control device 405 may be operably connected to the processor 220 wirelessly via a communication interface 280 of the computing entity 200.
- a database 115 may be operably connected to the processor 220 in a way such that it may store data created when the system 400 is operated by a user 430. It is understood that the various method steps associated with the methods of the present disclosure may be carried out as operations by the system 400 shown in FIG. 4.
- FIGS. 7-9 illustrate various methods that may be carried out by the system 400.
- FIG. 6 illustrates permission levels 600 that may be utilized by the system 400 for controlling access to user content 315, 335, 355.
- FIG. 5 illustrates an example virtual reality setting 425 that may be presented via a display 316 such as a virtual reality headset.
- a control device 405 is defined as a device used to control a physical machine and/or its corresponding paired application 415A.
- a plurality of functions 410 is coupled to the control device 405 in a way that may cause a corresponding paired application 415A and/or physical machine to perform an action.
- a user 430 may activate the plurality of functions 410 of the control device 405 by manipulating the control device 405 in a way such that it sends an input to the system 400. For instance, a user 430 may activate a switch of a control device 405 operably connected to a crane that causes an input to be sent to the system 400, which in turn causes the crane to elevate a hoist.
- a user 430 may toggle the directional controls of a surgical robot in a way that causes an input to be sent to the system 400, which in turn causes the surgical robot to move its robotic appendages.
- a user 430 may press a touchscreen of a control device 405 operably connected to a light system in a way that causes an input to be sent to the system 400, which in turn causes the light system to darken.
- the control device 405 may transmit the input to the processor 220 via the expansion port 310, 314.
- the processor 220 may then transform the function into a digital input and/or a virtual reality input.
- a digital input may be defined as a computer readable signal used by the system 400 to manipulate the paired application 415A.
- a virtual reality input may be defined as a computer readable signal used by the system 400 to manipulate the virtual reality setting 425.
- the system 400 may use the same functions within the plurality of functions 410 of a control device 405 to manipulate both the paired application 415A and virtual reality setting 425.
- the plurality of functions 410 of the control device 405 may be in the form of a driver.
- a driver may be defined as a software component that allows the control device 405 to communicate with the rest of the system 400.
- a paired application 415A requiring input from a control device 405 may call a process implemented by the system 400, which may call for a function implemented by a driver of the control device 405.
- the driver facilitates communication between the system 400 and the control device 405, wherein the processor 220 carries out the processes of the system 400 and the plurality of functions 410 of the driver.
- the processor 220 may pair the input with a certain function of the driver.
- a driver may be used to transform the input from the control device 405 into a digital input and/or virtual reality input.
- more than one driver may be used to communicate with a control device 405.
- These drivers may be layered in a driver stack, wherein each driver within the driver stack may perform a certain task. For instance, a stack having three drivers may have a first driver that communicates with the operating system and a second driver that communicates with the control device 405.
- the third driver may act as an intermediary between the first driver and second driver, wherein the third driver receives the plurality of functions 410 from the second driver and converts them into a digital input before relaying the digital input to the first driver.
- the first driver may then relay the digital input to the operating system, which may then relay the digital input to the paired application 415A.
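- As a rough illustration of the layered driver stack described above, the Python sketch below models the three drivers as classes; all class names, method names, and the string-based input are assumptions for illustration only, not part of the disclosure.

```python
# Sketch of the three-layer driver stack described above: a device-side
# driver reads the control device, an intermediary driver converts the
# function into a digital input, and an OS-side driver relays it onward.
# All names here are hypothetical.

class DeviceDriver:
    """Second driver: communicates with the control device."""
    def read_function(self, raw_event):
        return {"function": raw_event}

class ConverterDriver:
    """Third driver: intermediary that converts functions from the
    device driver into digital inputs for the OS-side driver."""
    def __init__(self, device_driver):
        self.device_driver = device_driver

    def to_digital_input(self, raw_event):
        func = self.device_driver.read_function(raw_event)
        return {"digital_input": func["function"].upper()}

class OSDriver:
    """First driver: relays the digital input to the operating system,
    which relays it to the paired application."""
    def relay(self, digital_input, paired_application):
        paired_application(digital_input)

# Wiring the stack together (hypothetical usage):
stack = OSDriver()
converter = ConverterDriver(DeviceDriver())
stack.relay(converter.to_digital_input("toggle_left"),
            lambda d: print("paired application received:", d))
```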
- a paired application 415A may be defined as a software application that has been created to work with a physical machine and/or control device 405.
- a physical machine may be defined as a mechanical device that a user 430 may operate to accomplish a specific task.
- a vascular robot designed to assist medical professionals in implanting a stent within a patient may be coupled with a paired application 415A that allows a user 430 to control the vascular robot.
- farm equipment designed to assist workers in tending to crops may be coupled with a paired application 415A that allows a user 430 to control various functions of the farm equipment.
- a mobile computing device 350 may be coupled with a paired application 415A that allows a user 430 to perform various tasks.
- the control device 405 may control various functions of the physical machine without the assistance of the paired application.
- the paired application may control every aspect of the physical machine. Control of the physical machine may be accomplished by using the software via manipulation of the control device 405.
- physical machines may have a virtual embodiment within a virtual reality setting 425.
- the virtual embodiment of the virtual reality setting 425 may be manipulated by a control device 405 operably connected to the system 400, wherein the plurality of functions 410 of the control device 405 may be transformed into virtual reality inputs that alter the virtual reality setting 425.
- a virtual reality setting 425 is defined as an artificial, interactive, computer-created scene or 'world' within which a user 430 may immerse themselves.
- the user 430 may navigate the virtual reality setting 425 using the display 316 and the control device 405.
- a virtual reality setting 425 may comprise a virtual reality environment 425 A, a virtual screen 425B, and a plurality of virtual reality objects 425C.
- the virtual reality environment 425A defines the boundaries of the virtual reality setting 425.
- a virtual reality setting 425 has no physical boundaries and must therefore have virtual boundaries.
- the size of a virtual reality environment 425A is relative to that of a physical environment, wherein the physical environment may be defined by a plurality of spatial positions 470 and the virtual reality environment 425A may be defined by a plurality of virtual positions 480.
- a virtual reality setting 425 comprising a surgical room may have a square boundary that appears to be similar to that in size of a physical version of a surgical room from the point of view of the user 430 as viewed through the display 316.
- a user 430 may not be able to view the boundaries of the environment.
- a user 430 having a virtual position 480 in the center of a virtual reality setting 425 comprising one square mile of densely forested land may not see the boundaries of the virtual reality setting 425, thus the virtual reality setting 425 may appear boundaryless.
- a user 430 may navigate a virtual reality setting 425 by changing their virtual position 480 within the virtual reality setting 425.
- the system 400 may alter the virtual position 480 of a user 430 by measuring changes in a user’s 430 spatial position 470 and converting those changes into changes in the user’s 430 virtual position 480.
- a virtual reality headset worn by a user 430 may measure the spatial position 470 of the user 430 and transmit the spatial position 470 to the system 400, wherein the system 400 may convert the changed spatial position 470 into a change in the user’s 430 virtual position 480.
- a user 430 may use a control device 405 to alter their virtual position 480 within a virtual reality setting 425.
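- One plausible way to implement the spatial-to-virtual conversion described above is a simple linear mapping, as in the hypothetical Python sketch below; the SCALE factor and the tuple representation of positions are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch, assuming a simple linear mapping, of converting a
# change in a user's spatial position (as tracked by a headset) into
# a change in the user's virtual position.

SCALE = 1.0  # virtual metres per physical metre (assumed factor)

def update_virtual_position(virtual_pos, old_spatial, new_spatial):
    """Convert a measured change in spatial position into a change
    in virtual position."""
    delta = tuple(n - o for n, o in zip(new_spatial, old_spatial))
    return tuple(v + SCALE * d for v, d in zip(virtual_pos, delta))

# The user steps 0.5 m forward in the physical world; the system
# moves their virtual position by the scaled amount.
print(update_virtual_position((0.0, 0.0, 0.0),
                              (0.0, 0.0, 0.0),
                              (0.0, 0.0, 0.5)))  # -> (0.0, 0.0, 0.5)
```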
- the virtual reality environment 425A may be populated by the plurality of virtual reality objects 425C.
- the plurality of virtual reality objects 425C may provide the virtual reality setting 425 with characteristics of the physical setting that the virtual reality setting 425 is meant to emulate.
- a virtual reality setting 425 created to emulate a surgical room may be populated with a floor, walls, ceiling, examination table, and a viewing window.
- a virtual reality setting 425 created to emulate a construction site may be populated with a fence, dirt, and a bulldozer.
- a virtual reality setting 425 created to emulate a beach resort may have sand, beach chairs, a volleyball net, and an ocean.
- the plurality of virtual reality objects 425C may comprise a virtual embodiment of a physical machine.
- the plurality of virtual reality objects 425C may be manipulated by a user 430 as the user 430 interacts with the virtual reality setting 425.
- a user 430 may interact with the virtual reality objects 425C using a control device 405 or some other form of equipment designed to allow a user 430 to interact with the virtual reality setting 425.
- a user 430 may wear haptic gloves that allow a user 430 to interact with objects within a virtual reality setting 425.
- a user 430 wearing a virtual reality headset that measures their spatial position 470 may interact with the environment by changing their virtual position 480 within the virtual reality setting 425.
- the user 430 may interact with the virtual embodiment in the virtual reality setting 425 as they would be able to interact with a physical machine in a physical setting.
- the plurality of virtual reality objects 425C that populate the virtual reality environment 425A may comprise a virtual reality display.
- the virtual reality display may be coupled to a virtual screen 425B in a way such that manipulating the virtual reality display in a way that changes its virtual position 480 within the virtual reality setting 425 may also manipulate the virtual position 480 of the virtual screen 425B.
- the virtual reality display may be defined as a virtual reality object 425C within a virtual reality setting 425 that is coupled to the virtual screen 425B.
- a virtual screen 425B may be defined as a virtual window that allows for data not part of the virtual reality setting 425 to be streamed into the virtual reality setting 425.
- a virtual window may be coupled to a streaming application 415 in a way such that data viewed within the streaming application 415 may be viewed within the virtual screen 425B.
- a streaming application 415 designed to execute a paired application 415A may be operably connected to a virtual screen 425B within a virtual reality setting 425.
- a user 430 within the virtual reality setting 425 may view the paired application 415A as it is being executed within the streaming application 415 even though the paired application 415A is not part of the virtual reality setting 425.
- the data streamed by the streaming application 415 to a virtual screen 425B in a virtual reality setting 425 is video data, wherein a user 430 within the virtual reality setting 425 may view the video data within the virtual screen 425B.
- a virtual reality setting 425 may have multiple virtual screens 425B that may stream data from multiple streaming applications 415.
- a virtual reality setting 425 comprising a sports bar may have multiple virtual screens 425B coupled to multiple virtual reality displays that are designed to look like flat screen televisions.
- Each virtual screen 425B within the plurality of virtual screens 425B may stream a different sports related event depending on the data streamed from the streaming applications 415 operably connected to the plurality of virtual screens 425B.
- a user 430 may choose which virtual screen 425B amongst the plurality of virtual screens 425B to watch by altering their virtual position 480 within the virtual reality setting 425.
- the streaming application 415 operably connected to the virtual screen 425B of a virtual reality setting 425 may be executed by the system 400 or by a computing entity 200 operably connected to the system 400.
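- The routing of stream data into virtual screens described above might be modeled as in the following hypothetical Python sketch; the StreamingApplication and VirtualScreen classes and their string frame representation are assumptions for illustration, not part of the disclosure.

```python
# Sketch of multiple virtual screens, each operably connected to its
# own streaming application, so data not part of the VR setting can
# be viewed inside it. All names are hypothetical.

class StreamingApplication:
    def __init__(self, source_name):
        self.source_name = source_name

    def next_frame(self):
        return f"frame from {self.source_name}"

class VirtualScreen:
    """Virtual window through which data not part of the VR setting
    may be viewed inside the VR setting."""
    def __init__(self, stream):
        self.stream = stream

    def render(self):
        return self.stream.next_frame()

# A sports-bar setting with several screens, each streaming a
# different event (illustrative):
screens = [VirtualScreen(StreamingApplication(s))
           for s in ("game_1", "game_2", "game_3")]
for screen in screens:
    print(screen.render())
```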
- a computing entity 200 operably connected to the system 400 renders the virtual reality setting 425.
- the computing entity 200 that renders the virtual reality setting 425 preferably comprises a display 316, secondary processor 220 operably connected to the display 316, a secondary power supply, and a secondary non-transitory computer-readable medium 402 having instructions stored therein.
- the secondary processor 220 of the computing entity 200 preferably receives the virtual reality setting 425 from the non-transitory computer-readable medium 402. Once received, the secondary processor 220 may render the virtual reality setting 425 and transmit the virtual reality setting 425 to the display 316.
- the display 316 is a virtual reality headset as depicted in FIG. 4.
- the virtual reality headset may present the virtual reality setting 425 to the user 430.
- the secondary processor 220 may receive a stream from the system 400 and transmit that stream to the virtual screen 425B of the virtual reality setting 425. A viewer may then view both the virtual reality setting 425 and the stream within the virtual screen 425B.
- the virtual screen 425B may stream the paired application 415A running within the streaming application 415 of the system 400.
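- A render loop on the secondary processor consistent with the above might look like the following hypothetical Python sketch, which composites the latest stream frame onto the virtual screen each pass; all names and the dictionary scene representation are illustrative assumptions, not part of the disclosure.

```python
# Sketch of the render loop on the secondary processor: each pass
# renders the virtual reality setting, pulls the latest frame from
# the system's stream, composites it onto the virtual screen, and
# transmits the result to the display. All names are hypothetical.

def render_loop(setting, stream, display, frames=3):
    for _ in range(frames):
        scene = setting.render()                       # render VR setting
        scene["virtual_screen"] = stream.next_frame()  # composite stream
        display.present(scene)                         # transmit to headset

class Setting:
    def render(self):
        return {"environment": "surgical room", "objects": ["table"]}

class Stream:
    def __init__(self):
        self.n = 0

    def next_frame(self):
        self.n += 1
        return f"paired application frame {self.n}"

class Display:
    def present(self, scene):
        print(scene)

render_loop(Setting(), Stream(), Display())
```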
- the virtual reality setting 425 may present a user 430 with a virtual scenario.
- a virtual scenario may be defined as a situation in which a user 430 must perform a series of actions to complete a task presented by the virtual reality setting 425.
- a user 430 may be presented a virtual scenario in which they are asked to perform a surgical procedure using a surgical robot.
- a user 430 may be presented a virtual scenario in which they are asked to operate computerized farm equipment in a way such that it plows a field in a desired pattern within the virtual reality setting 425.
- Information streamed into the virtual screen 425B may assist the user 430 in completing the virtual scenario.
- a user 430 within a virtual scenario involving a surgical procedure may view a virtual screen 425B of an instructor giving step-by-step instructions on how to properly perform the surgical procedure.
- the virtual scenario may be designed for a particular paired application 415A.
- a virtual scenario designed to train a pilot to fly an Airbus A380 may comprise a scenario inside the cockpit of an Airbus A380, wherein the virtual embodiment of the cockpit may have a plurality of virtual screens 425B streaming the paired applications 415A of the Airbus A380.
- actions taken by the user 430 within the virtual reality setting 425 may be recorded by the computing entity 200.
- results of the virtual scenario may be recorded to allow other users 430 to analyze progress made via training. For instance, a virtual scenario may grade the performance of a user 430 by comparing a user’s 430 actions with the predetermined desired actions of the virtual scenario. An instructor may then use these results to provide further instruction to a user 430 when needed.
- the system 400 may record the user 430 within the virtual scenario. In one preferred embodiment, the recording may be from the user’s 430 point of view within the virtual reality setting 425 as seen through the display 316.
- the recording may be of the virtual reality setting 425 itself, wherein every aspect of the virtual reality setting 425 is recorded as the user 430 interacts with the virtual scenario so that an instructor may view a user’s 430 actions within the virtual scenario from any perspective within the virtual reality environment 425A.
- a recording of a virtual scenario involving combat training may allow an instructor to view the actions of the user 430 from various virtual positions 480 within the virtual reality setting 425.
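- Recording every aspect of the virtual reality setting, rather than only the user's viewpoint, might be implemented as in the hypothetical Python sketch below, which snapshots the full setting state each frame so playback can adopt any perspective; the snapshot structure is an illustrative assumption.

```python
# Minimal sketch of recording the entire virtual reality setting per
# frame so an instructor can later replay the user's actions from any
# virtual position. The snapshot structure is an assumption.

import copy
import time

def record_frame(recording, setting_state, user_virtual_pos):
    """Append a full snapshot of the setting, not just the user's
    viewpoint, so playback can adopt any perspective."""
    recording.append({
        "timestamp": time.time(),
        "setting": copy.deepcopy(setting_state),
        "user_position": user_virtual_pos,
    })

recording = []
record_frame(recording, {"objects": ["cadaver", "table"]}, (1.0, 0.0, 2.0))
print(len(recording), "frame(s) recorded")
```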
- the recorded virtual scenario results of a user 430 may be stored within the computing entity 200.
- the training results may be stored within a user profile 420 on a storage device 250.
- a user profile 420 may be defined as a digital folder that contains information pertaining to a particular user 430.
- a user profile 420 may contain personal information such as name, date of birth, photograph, etc.
- a user profile 420 may contain information pertaining to training scenarios they have undertaken.
- the information within a user profile 420 may be organized using various files and folders. For instance, personal information may be stored in a separate folder from information regarding training scenarios.
- the system 400 may store the information within a non-transitory computer-readable medium 402 of the computing entity 200.
- the information may be stored on a database 115 operably connected to the processor 220, wherein access to the database 115 may be limited based on various permission levels 600.
- a server 110 may be operably connected to the processor 220 and database 115 to facilitate the transfer of data between the computing entity 200 and the database 115.
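- A permission check consistent with the permission levels 600 of FIG. 6 might look like the following hypothetical Python sketch; the role names and the rule that higher levels may read lower-level profiles are illustrative assumptions, not part of the disclosure.

```python
# Sketch of permission-level checks gating access to stored training
# results. The levels and the rule that instructors may read student
# profiles are illustrative assumptions.

PERMISSION_LEVELS = {"student": 1, "instructor": 2, "administrator": 3}

def can_read_profile(requesting_user, profile_owner):
    """A user may read their own profile; higher permission levels
    may read the profiles of lower levels."""
    if requesting_user["id"] == profile_owner["id"]:
        return True
    return (PERMISSION_LEVELS[requesting_user["role"]]
            > PERMISSION_LEVELS[profile_owner["role"]])

student = {"id": 1, "role": "student"}
instructor = {"id": 2, "role": "instructor"}
print(can_read_profile(instructor, student))  # True
print(can_read_profile(student, instructor))  # False
```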
- the system 400 and method of the present disclosure may use a plurality of user profiles 420 stored within a database.
- Each user profile 420 may be constructed using resulting data from actions taken by users 430 during a virtual scenario.
- each virtual scenario may have a number of performance factors associated therewith, each of which corresponds to an aspect of a professional’s performance.
- Each performance factor has a plurality of defined performance limits associated therewith.
- the system 400 may grade a user’s 430 performance within the scenario. In this way, the performance factors and performance limits associated therewith may be used to define a performance grade that a particular user 430 achieved during a particular virtual scenario.
- Performance factors tied to a particular virtual scenario may depend on the type of training and the objective of the virtual scenario. For instance, a virtual scenario involving implanting a heart stent may take speed, location, and invasiveness into consideration when determining the performance grade of a user 430. As another example, a virtual scenario involving flying an airplane may determine a user’s 430 performance grade based on takeoff and landing results.
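- The grading described above might be sketched as follows; the specific performance factors, their limits, and the equal weighting in this hypothetical Python fragment are illustrative assumptions, not values from the disclosure.

```python
# Sketch of grading a user's performance: each performance factor has
# defined performance limits, and the grade reflects how many factors
# fell within their limits. Factor names and limits are assumptions.

PERFORMANCE_LIMITS = {          # (lower, upper) limits per factor
    "speed_seconds": (0, 600),
    "placement_error_mm": (0, 2.0),
    "invasiveness_score": (0, 3),
}

def performance_grade(measurements):
    """Fraction of performance factors within their defined limits."""
    within = sum(
        lo <= measurements[f] <= hi
        for f, (lo, hi) in PERFORMANCE_LIMITS.items()
    )
    return within / len(PERFORMANCE_LIMITS)

# A heart-stent scenario graded on speed, location, and invasiveness:
print(performance_grade({"speed_seconds": 480,
                         "placement_error_mm": 1.2,
                         "invasiveness_score": 4}))  # -> 0.666...
```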
- the programming instructions responsible for the operations carried out by the processor 220 are stored on a non-transitory computer-readable medium 402 (“CRM”), which may be coupled to the processor 220, as shown in FIG. 4.
- the programming instructions may be stored or included within the processor 220.
- Examples of the non-transitory computer-readable medium 402 include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM discs and DVDs; magneto-optical media such as optical discs; and hardware devices that are specifically configured to store and perform programming instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like.
- the programming instructions may be stored as modules within the non-transitory computer-readable medium 402.
- computing entities 200 may communicate audibly, meaning computing entities 200 may transmit and receive information via sound waves and convert the sound waves into digital information.
- a user 430 may instruct a user interface of a computing entity 200 with their voice to perform a certain action.
- the processor 220 may convert the sound waves of the user 430 into instructions, which the processor 220 may then carry out.
- a user 430 may audibly communicate with the system 400 in a way that causes the system 400 to alter a user’s 430 virtual position 480 within a virtual reality setting 425.
- Computing entities 200 may likewise generate audible sound for a user 430, such as through an audio device. Such sound may include sound from voice telephone calls, recorded notes, voice messages, music files, etc.
- Audible sounds may also include sound generated by applications operating on a computing entity 200.
- an application running on a mobile computing entity 200 may be configured in a way such that when a certain condition is met the application causes the mobile computing entity 200 to output a sound.
- an application may be configured in a way such that an alarming sound is emitted via an audio device connected to the computing entity 200 at a certain time of day.
- the processor 220 may receive a signal indicating that the user 430 is about to make or has made a mistake during the virtual scenario while using the control device 405. The processor 220 may then convert this signal into an audio message that may be sent to an audio device to make the user 430 aware of the mistake.
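- The sketch below illustrates one way such a mistake signal might be converted into an audio message; the signal codes, message text, and function names are assumptions made for illustration.

```python
# Hypothetical mistake codes mapped to spoken warnings for the user 430.
MISTAKE_MESSAGES = {
    "off_target": "Warning: instrument is drifting off target.",
    "excess_force": "Warning: excessive force applied.",
}

def handle_mistake_signal(code, audio_out):
    """Convert a mistake signal into an audio message for the user."""
    message = MISTAKE_MESSAGES.get(code, "Warning: check your last action.")
    audio_out(message)  # e.g., hand the text to a text-to-speech engine

handle_mistake_signal("off_target", print)  # `print` stands in for an audio device
```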
- the system 400 may further comprise a user interface.
- a user interface may be defined as a space where interactions between a user 430 and the system 400 may take place. In an embodiment, the interactions may take place in a way such that a user 430 may control the operations of the system 400.
- a user interface may include, but is not limited to, operating systems, command line user interfaces, conversational interfaces, web-based user interfaces, zooming user interfaces, touch screens, task-based user interfaces, touch user interfaces, text-based user interfaces, intelligent user interfaces, and graphical user interfaces, or any combination thereof.
- the system 400 may present data of the user interface to the user 430 via a display 316 operably connected to the processor 220.
- a display 316 may be defined as an output device that communicates data that may include, but is not limited to, visual, auditory, cutaneous, kinesthetic, olfactory, and gustatory, or any combination thereof.
- Information presented via a display 316 may be referred to as a soft copy of the information because the information exists electronically and is presented for a temporary period of time.
- Information stored on the non-transitory computer-readable medium 402 may be referred to as the hard copy of the information.
- a display 316 may present a soft copy of visual information via a liquid crystal display (LCD), wherein the hard copy of the visual information is stored on a local hard drive.
- a display 316 may present a soft copy of audio information via a speaker, wherein the hard copy of the audio information is stored on a flash drive.
- a display 316 may present a soft copy of tactile information via a haptic suit, wherein the hard copy of the tactile information is stored within a database.
- Displays 316 may include, but are not limited to, cathode ray tube monitors, LCD monitors, light emitting diode (LED) monitors, gas plasma monitors, screen readers, speech synthesizers, haptic suits, virtual reality headsets, speakers, and scent generating devices, or any combination thereof.
- the system 400 may comprise a power supply.
- the power supply may be any source of power that provides the system 400 with electricity.
- the power supply may be a stationary power outlet.
- the system 400 may comprise multiple power supplies that may provide power to the system 400 in different circumstances. For instance, the system 400 may be directly plugged into a stationary power outlet, which may provide power to the system 400 so long as it remains in one place. However, the system 400 may also be connected to a backup battery so that the system 400 may receive power even when it is not connected to a stationary power outlet.
- the security method of the system 400 may comprise a plurality of permission levels 600 that may grant users 430 access to user content 615, 635, 655 within the database 115 while simultaneously denying users without appropriate permission levels 600 the ability to view user content 615, 635, 655.
- to access user content 615, 635, 655, users 430 may be required to make a request via a user interface. Access to the data within the database 115 may be granted or denied by the processor 220 based on verification of a requesting user’s 605, 625, 645 permission level 600.
- if the requesting user’s 605, 625, 645 permission level 600 is sufficient, the processor 220 may provide the requesting user 605, 625, 645 access to user content 615, 635, 655 stored within the database 115. Conversely, if the requesting user’s 605, 625, 645 permission level 600 is insufficient, the processor 220 may deny the requesting user 605, 625, 645 access to user content 615, 635, 655 stored within the database 115.
- permission levels 600 may be based on user roles 610, 630, 650 and administrator roles 670, as illustrated in FIG. 6.
- User roles 610, 630, 650 allow requesting users 605, 625, 645 to access user content 615, 635, 655 that a user 430 has uploaded and/or otherwise obtained through use of the system 400.
- Administrator roles 670 allow administrators 665 to access system 400 wide data.
- user roles 610, 630, 650 may be assigned to a user in a way such that a requesting user 605, 625, 645 may view user profiles 420 containing virtual scenario results and personal information via a user interface.
- the system 400 may be configured to send an instructor a notification indicating that a user has obtained new virtual scenario results.
- a user may make a user request via the user interface to the processor 220.
- the processor 220 may grant or deny the request based on the permission level 600 associated with the requesting user 605, 625, 645. Only users having appropriate user roles 610, 630, 650 or administrator roles 670 may access the data within the user profiles 420.
- requesting user 1 605 has permission to view user 1 content 615 and user 2 content 635, whereas requesting user 2 625 only has permission to view user 2 content 635.
- user content 615, 635, 655 may be restricted in a way such that a user may only view a limited amount of user content 615, 635, 655.
- requesting user 3 645 may be granted a permission level 600 that only allows them to view user 3 content 655 related to personal information but not user 3 content 655 related to virtual scenario results.
- an administrator 665 may bestow a new permission level 600 on users so as to grant them greater or lesser permissions.
- an administrator 665 may bestow a greater permission level 600 on other users so that they may view user 3’s content 655 and/or any other user’s content 615, 635, 655. Therefore, the permission levels 600 of the system 400 may be assigned to users 430 in various ways without departing from the inventive subject matter described herein.
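- The sketch below illustrates the permission check described above, using the example grants of FIG. 6; the role names, content identifiers, and grant table are illustrative assumptions rather than a prescribed data model.

```python
# Hypothetical grant table mirroring FIG. 6's example permission levels.
PERMISSIONS = {
    "requesting_user_1": {"user_1_content", "user_2_content"},
    "requesting_user_2": {"user_2_content"},
    "requesting_user_3": {"user_3_personal_info"},  # but not scenario results
    "administrator":     {"*"},                      # system-wide access
}

def request_content(requester, content_id):
    """Grant or deny access based on the requester's permission level."""
    allowed = PERMISSIONS.get(requester, set())
    if "*" in allowed or content_id in allowed:
        return f"granted: {content_id}"
    return "denied: insufficient permission level"

print(request_content("requesting_user_2", "user_2_content"))  # granted
print(request_content("requesting_user_2", "user_1_content"))  # denied
```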
- FIG. 7 provides a flow chart 700 illustrating certain, preferred method steps that may be used to carry out the method of importing a paired application 415A into a virtual reality setting 425.
- Step 705 indicates the beginning of the method.
- the processor 220 may execute the application software within the streaming application 415.
- the processor 220 may capture video data of the paired application 415A running within the streaming application 415 during step 715.
- the processor 220 may render the virtual reality setting 425 during step 712 wherein the processor 220 may subsequently stream the previously captured video data to the virtual reality setting 425 during step 720.
- the virtual reality setting 425 may be rendered on a computing entity 200 operably connected to the system 400, wherein the paired application 415A may be imported into the virtual reality setting 425 rendered by the computing entity 200 via the processor 220.
- the processor 220 may display the data within the virtual screen 425B of the virtual reality setting 425 during step 725.
- the processor 220 may then transmit the virtual reality setting 425 and the stream running within the virtual screen 425B to a user 430 during step 730.
- the display 316 may present the virtual reality setting 425 and the paired application 415A within the virtual screen 425B to the user 430 during step 735.
- the method may proceed to the terminate method step 740.
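- The sketch below walks through the FIG. 7 flow in miniature: the paired application 415A runs within the streaming application 415, its video is captured, and each frame is streamed to the virtual screen 425B of the rendered virtual reality setting 425 before being presented to the user 430. All class and method names are assumptions made for illustration.

```python
class StreamingApplication:
    """Hosts the paired application and captures its video output."""
    def execute(self, paired_app):
        self.paired_app = paired_app          # run the paired application
    def capture_frame(self):
        return f"frame of {self.paired_app}"  # step 715: capture video data

class VirtualRealitySetting:
    """A rendered setting containing a virtual screen (step 712)."""
    def __init__(self):
        self.virtual_screen = []
    def stream_frame(self, frame):
        self.virtual_screen.append(frame)     # steps 720/725: stream and display
    def present(self, display):
        display(self.virtual_screen[-1])      # steps 730/735: transmit and present

streamer = StreamingApplication()
streamer.execute("paired application 415A")
setting = VirtualRealitySetting()
setting.stream_frame(streamer.capture_frame())
setting.present(print)  # `print` stands in for the user's display 316
```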
- FIG. 8 provides a flow chart 800 illustrating certain, preferred method steps that may be used to carry out the method of receiving an input from a control device 405 and transforming that input into a digital input and virtual reality input.
- Step 805 indicates the beginning of the method.
- the processor 220 may receive an input from a control device 405 that has been manipulated by a user 430.
- a control device 405 may have a plurality of buttons and/or switches that may be manipulated in a way that allows a user 430 to control what input of the control device 405 the system 400 receives.
- the processor 220 may convert the input into a computer readable signal via a driver during step 815.
- the processor 220 preferably converts the input into a function, which may then be converted by the processor 220 into a digital signal.
- the processor 220 may also convert the function into a virtual reality signal as illustrated in FIG. 9.
- the processor 220 may transmit the computer readable signal to the paired application 415A during step 820.
- the processor 220 may then perform a query during step 822 to determine whether or not there is a virtual embodiment within the virtual reality setting 425.
- during step 825, the processor 220 may determine what to do based on the query. If the virtual reality setting 425 does not comprise a virtual embodiment of a physical machine, the system 400 may proceed to the terminate method step 830. If the system 400 does comprise a virtual reality setting 425 having a virtual embodiment of a physical machine, the system 400 may proceed to step 827, wherein the processor 220 may transmit the virtual reality signal to the virtual reality setting 425. Once the virtual reality signal has been transmitted to the virtual reality setting 425, the system 400 may proceed to the terminate method step 830.
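- The sketch below mirrors the FIG. 8 flow: an input is converted into a function via a hypothetical driver mapping, relayed to the paired application 415A as a digital signal and, when the setting contains a virtual embodiment, also relayed as a virtual reality signal. The mapping and signal format are assumptions.

```python
DRIVER_FUNCTIONS = {"switch_up": "elevate_hoist"}  # hypothetical driver mapping

def process_input(raw_input, app_signals, vr_setting):
    """Steps 815-827 in miniature: convert an input and route the signals."""
    function = DRIVER_FUNCTIONS[raw_input]        # step 815: convert via driver
    app_signals.append(("digital", function))     # step 820: to the paired application
    if vr_setting.get("virtual_embodiment"):      # steps 822/825: query the setting
        vr_setting["signals"].append(("virtual", function))  # step 827
    # in either case the method then terminates (step 830)

app_signals = []
setting = {"virtual_embodiment": True, "signals": []}
process_input("switch_up", app_signals, setting)
print(app_signals, setting["signals"])
```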
- FIG. 9 provides a flow chart 900 illustrating certain, preferred method steps that may be used to carry out the method of importing a paired application 415A into a virtual reality setting 425.
- Step 905 indicates the beginning of the method.
- a user 430 is provided with the system 400.
- the user 430 is then provided with a paired application 415A in step 915, wherein the paired application 415A is executed on the system 400.
- the paired application 415A is coupled to a physical machine, wherein a virtual embodiment of the physical machine may be manipulated by the paired application 415A within a virtual reality setting 425.
- the paired application 415A is executed within a streaming application 415 of the system 400.
- the user 430 is provided with the control device 405 coupled to a plurality of functions 410 during step 920, wherein manipulation of the control device 405 by the user 430 allows the user 430 to control the paired application 415A.
- the user 430 may then be provided with the virtual reality setting 425 in step 925, wherein the virtual reality setting 425 is rendered by the system 400.
- the virtual reality setting 425 is made specifically for the paired application 415A.
- the system 400 then renders the virtual reality setting 425 and executes the paired application 415A during step 930.
- the system 400 may stream the paired application 415A to the virtual reality setting 425 during step 935.
- the virtual reality setting 425 comprises a virtual screen 425B to which the paired application 415A is streamed. Users 430 within the virtual reality setting 425 may view the data streamed to the virtual screen 425B.
- the system 400 may then receive inputs from the control device 405 during step 940, which may instruct the system 400 how the user 430 wishes to proceed.
- a driver facilitates communication between the control device 405 and the system 400, wherein the inputs of the control device 405 correspond to certain functions within the plurality of functions of the driver.
- the system 400 may transform the functions into a computer readable signal during step 945.
- the computer readable signal may be a digital input or a virtual input.
- the system 400 may then use this computer readable signal to alter the paired application 415A during step 950.
- the system 400 may use the computer readable signal to alter the virtual reality setting 425.
- the user 430 is provided with a display 316 in step 955.
- the display 316 is a virtual reality headset.
- the system 400 may transmit the virtual reality setting 425 comprising a virtual screen 425B streaming the paired application 415A to the display 316 and present it to the user 430 during step 960.
- the method may proceed to the terminate method step 965.
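- Tying the FIG. 9 steps together, the sketch below runs a miniature end-to-end session; every identifier is an illustrative assumption reusing the ideas from the earlier sketches.

```python
def run_session(paired_app, control_inputs, headset):
    """Steps 930-960 in miniature: render, stream, react to inputs, present."""
    virtual_screen = [f"stream of {paired_app}"]           # steps 930/935
    for raw in control_inputs:                             # step 940: receive input
        signal = ("digital", raw)                          # step 945: transform
        virtual_screen.append(f"{paired_app} after {signal}")  # step 950: alter app
    headset(virtual_screen[-1])                            # steps 955/960: present

run_session("paired application 415A", ["switch_up"], print)
```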
- the subject matter described herein may be embodied in systems, apparatuses, methods, and/or articles depending on the desired configuration.
- various implementations of the subject matter described herein may be realized in digital electronic circuitry, integrated circuitry, specially designed application specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof.
- These various implementations may include implementation in one or more computer programs that may be executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, and at least one peripheral device.
- Non-transitory computer-readable medium refers to any computer program, product, apparatus, and/or device, such as magnetic discs, optical disks, memory, and Programmable Logic Devices (PLDs), used to provide machine instructions and/or data to a programmable processor, including a non-transitory computer-readable medium that receives machine instructions as a computer-readable signal.
- computer-readable signal refers to any signal used to provide machine instructions and/or data to a programmable processor.
- to provide for interaction with a user, the subject matter described herein may be implemented on a computer having a display device, such as a cathode ray tube (CRT), liquid crystal display (LCD), or light emitting diode (LED) monitor, for displaying information to the user, and a keyboard and a pointing device, such as a mouse or a trackball, by which the user may provide input to the computer.
- Displays may include, but are not limited to, visual, auditory, cutaneous, kinesthetic, olfactory, and gustatory displays, or any combination thereof.
- feedback provided to the user may be any form of sensory feedback, such as visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form including, but not limited to, acoustic, speech, or tactile input.
- the subject matter described herein may be implemented in a computing system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server, or that includes a front-end component, such as a client computer having a graphical user interface or a Web browser through which a user may interact with the system described herein, or any combination of such back-end, middleware, or front-end components.
- the components of the system may be interconnected by any form or medium of digital data communication, such as a communication network.
- Examples of communication networks may include, but are not limited to, a local area network (“LAN”), a wide area network (“WAN”), metropolitan area networks (“MAN”), and the internet.
- a system for importing a software application into a virtual reality setting comprising: an expansion port,
- a processor operably connected to said control device via said expansion port
- a control device operably connected to said expansion port and having a plurality of functions stored thereon
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A system and method for importing a software application into a virtual reality setting is provided. The system generally comprises a computing device, control device operably connected to the computing device, computing entity, and display. The computing device may comprise a processor, expansion port operably connected to the processor, power supply, and a non-transitory computer-readable medium coupled to the processor and having instructions stored thereon. A user may navigate the virtual reality setting using the display. A virtual screen within the virtual reality setting allows the system to stream the software application into the virtual reality setting, and the control device, made specifically for the software application, allows a user to operate the software application separate from the virtual reality setting.
Description
SYSTEM AND METHOD FOR IMPORTING A SOFTWARE APPLICATION INTO A VIRTUAL
REALITY SETTING
FIELD OF THE DISCLOSURE
[0001] The subject matter of the present disclosure refers generally to a system and method for importing a software application into a virtual reality setting.
BACKGROUND
[0002] Virtual reality provides endless opportunities to train professionals in various fields on how to use equipment they may encounter while on the job. A virtual reality environment may be made for multiple scenarios that more realistically simulate situations that professionals may find themselves in while operating in their chosen field. It also allows other professionals to view the training professional’s performance as they attempt to complete the training exercise without spatial proximity becoming an issue. Additionally, by creating a training scenario in a virtual reality setting, the same scenario may be used to train multiple professionals multiple times. This may drastically reduce costs depending on the number of training simulations a group of professionals must undergo since each training scenario might otherwise need to be physically
created. Waste is also greatly reduced for training exercises that require one time use materials. For instance, a medical training institution needing cadavers for each of their students may spend a considerable amount of money, whereas a single virtual reality simulation providing multiple cadavers in various conditions may provide multiple students a greater variety of training at a much lower cost.
[0003] Providing professionals training on the proper use of medical equipment within a virtual reality setting is becoming an increasingly popular and cost-effective method to train. Modern medical equipment is often coupled to software specially designed to control its various features. Further complicating matters is the fact that control devices designed for the medical equipment are not easy to replicate in a virtual environment. Currently, the software must be recreated within the virtual reality setting, and virtual reality game controls must be used in place of the control device. Recreating the software within the virtual reality setting is both time consuming and expensive, and the recreation is not always an exact replica of the software a professional may use in a real-life situation. This means that not only do current virtual reality settings designed to train medical professionals use software different from what a medical professional would use in real-life situations, but medical professionals training in current virtual reality settings are not even training on the same control devices they would use in a real-life scenario. This has slowed the transition to virtual reality for training purposes and is causing money to be spent on training that might be better put to use in other areas, such as research and development.
[0004] Accordingly, there is a need in the art for a system and method for streaming software into a virtual environment while simultaneously allowing a user to use the same controls they would use in a real-life situation.
SUMMARY
[0005] A system and method for importing software into a virtual reality setting is provided. The system generally comprises a computing device, control device operably connected to the computing device, computing entity, and display. In a preferred embodiment, the computing device comprises a processor, expansion port operably connected to the processor, power supply, and a non-transitory computer-readable medium coupled to the processor and having instructions stored thereon. The control device may be operably connected to the processor via the expansion port. In another preferred embodiment, the control device may be operably connected to the processor wirelessly via a communication interface of the computing entity. In one preferred embodiment, a database may be operably connected to the processor in a way such that it may store data created when the system is operated by a user. It is understood that the various method steps associated with the methods of the present disclosure may be carried out as operations by the system.
[0006] A plurality of functions is coupled to the control device in a way that may cause a corresponding paired application and/or physical machine to perform an action. A user may activate the plurality of functions of the control device by manipulating the control device in a way such that it sends an input to the system. The control device may transmit the input to the processor via the expansion port. The processor may then transform the input into a digital input and/or a virtual reality input. The plurality of functions of the control device may be in the form of a driver. The driver facilitates communication between the system and the control device, wherein the processor carries out the processes of the system and the plurality of functions of the driver. When the user manipulates the control device in a way that provides the system with an input via the
expansion port, the processor may pair the input with a certain function of the driver. Once the processor pairs input from the control device with a function of the driver, the processor may relay the function to the paired application, which is a software application that has been created to work with a physical machine and/or control device.
[0007] A user may navigate the virtual reality setting using a display and the control device. A virtual reality setting may comprise a virtual reality environment, a virtual screen, and a plurality of virtual reality objects. The virtual reality environment defines the boundaries of the virtual reality setting. A user may navigate a virtual reality setting by changing their virtual position within the virtual reality setting. The virtual reality environment may be populated by the plurality of virtual reality objects. The plurality of virtual reality objects may provide the virtual reality setting with characteristics of the physical setting that the virtual reality setting is meant to emulate. The plurality of virtual reality objects may be manipulated by a user as the user interacts with the virtual reality setting. A user may interact with the virtual reality objects using a control device or some other form of equipment designed to allow a user to interact with the virtual reality setting. A virtual screen allows for data not part of the virtual reality setting to be streamed into the virtual reality setting. The virtual screen may be coupled to a streaming application in a way such that data viewed within the streaming application may be viewed within the virtual screen. In some embodiments, a virtual reality setting may have multiple virtual screens that may stream data from multiple streaming applications.
[0008] A computing entity operably connected to the system may render the virtual reality setting. The computing entity that renders the virtual reality setting preferably comprises a display, secondary processor operably connected to the display, a secondary power supply, and a secondary non-transitory computer-readable medium operably connected to the processor and having
instructions stored thereon. The secondary processor of the computing entity preferably receives the virtual reality setting from the non-transitory computer-readable medium. Once received, the secondary processor may render the virtual reality setting and transmit the virtual reality setting to the display. The secondary processor may receive a stream from a streaming application of the system and transmit that stream to the virtual screen within the virtual reality setting. A viewer may then view both the virtual reality setting and the stream within the virtual screen.
[0009] The foregoing summary has outlined some features of the system and method of the present disclosure so that those skilled in the pertinent art may better understand the detailed description that follows. Additional features that form the subject of the claims will be described hereinafter. Those skilled in the pertinent art should appreciate that they can readily utilize these features for designing or modifying other structures for carrying out the same purpose of the system and method disclosed herein. Those skilled in the pertinent art should also realize that such equivalent designs or modifications do not depart from the scope of the system and method of the present disclosure.
DESCRIPTION OF THE DRAWINGS
[00010] These and other features, aspects, and advantages of the present disclosure will become better understood with regard to the following description, appended claims, and accompanying drawings where:
FIG. 1 is a diagram of an example environment in which techniques described herein may be implemented.
FIG. 2 is a diagram of an example environment in which techniques described herein may be implemented.
FIG. 3 is a diagram of an example environment in which techniques described herein may be implemented.
FIG. 4 is a diagram of an example environment in which techniques described herein may be implemented.
FIG. 5 is an example virtual reality setting in which software may be integrated.
FIG. 6 is a diagram illustrating the manner in which individual access to data may be granted or limited based on user or system roles.
FIG. 7 is a flow chart illustrating certain method steps of a method embodying features consistent with the principles of the present disclosure.
FIG. 8 is a flow chart illustrating certain method steps of a method embodying features consistent with the principles of the present disclosure.
FIG. 9 is a flow chart illustrating certain method steps of a method embodying features consistent with the principles of the present disclosure.
DETAILED DESCRIPTION
[00011] In the Summary above and in this Detailed Description, and the claims below, and in the accompanying drawings, reference is made to particular features, including method steps, of the invention. It is to be understood that the disclosure of the invention in this specification includes all possible combinations of such particular features. For instance, where a particular feature is disclosed in the context of a particular aspect or embodiment of the invention, or a particular claim, that feature can also be used, to the extent possible, in combination with or in the context of other particular aspects of the embodiments of the invention, and in the invention generally.
[00012] The term “comprises” and grammatical equivalents thereof are used herein to mean that other components, steps, etc. are optionally present. For instance, a system 400 “comprising” components A, B, and C can contain only components A, B, and C, or can contain not only components A, B, and C, but also one or more other components. Where reference is made herein to a method comprising two or more defined steps, the defined steps can be carried out in any order or simultaneously (except where the context excludes that possibility), and the method can include one or more other steps which are carried out before any of the defined steps, between two of the defined steps, or after all the defined steps (except where the context excludes that possibility). As used herein, a spatial position 470 may be defined as a user’s 430 position within the physical world, whereas a virtual position 480 may be defined as a user’s 430 position within a virtual world.
[00013] As will be evident from the disclosure provided below, the present invention satisfies the need for a system 400 and method capable of importing a software application into a virtual reality setting 425 while simultaneously allowing a user 430 to use the control device 405 they might use in a real-life situation, and thereby improving upon known systems currently employed within the art.
[00014] FIG. 1 depicts an exemplary environment 100 of the system 400 consisting of clients 105 connected to a server 110 and/or database 115 via a network 150. Clients 105 are devices of users 430 that may be used to access servers 110 and/or databases 115 through a network 150. A network 150 may comprise one or more networks of any kind, including, but not limited to, a local area network (LAN), a wide area network (WAN), metropolitan area networks (MAN), a telephone network, such as the Public Switched Telephone Network (PSTN), an intranet, the Internet, a memory device, another type of network, or a combination of networks. In a preferred embodiment, computing entities 200 may act as clients 105 for a user 430. For instance, a client
105 may include a personal computer, a wireless telephone, a personal digital assistant (PDA), a laptop, a smart phone, a tablet computer, or another type of computation or communication device. Servers 110 may include devices that access, fetch, aggregate, process, search, provide, and/or maintain documents. Although FIG. 1 depicts a preferred embodiment of an environment 100 for the system 400, in other implementations, the environment 100 may contain fewer components, different components, differently arranged components, and/or additional components than those depicted in FIG. 1. Alternatively, or additionally, one or more components of the environment 100 may perform one or more other tasks described as being performed by one or more other components of the environment 100.
[00015] As depicted in FIG. 1, one embodiment of the system 400 may comprise a server 110.
Although shown as a single server 110 in FIG. 1, a server 110 may, in some implementations, be implemented as multiple devices interlinked together via the network 150, wherein the devices may be distributed over a large geographic area and perform different or similar functions. For instance, two or more servers 110 may be implemented to work as a single server 110 performing the same tasks. Alternatively, one server 110 may perform the functions of multiple servers 110. For instance, a single server 110 may perform the tasks of a web server 110 and an indexing server 110. Additionally, it is understood that multiple servers 110 may be used to operably connect the processor 220 to the database 115 and/or other content repositories. The processor 220 may be operably connected to the server 110 via wired or wireless connection. Types of servers 110 that may be used by the system 400 include, but are not limited to, search servers 110, document indexing servers 110, and web servers 110, or any combination thereof.
[00016] Search servers 110 may include one or more computing entities 200 designed to implement a search engine, such as a documents/records search engine, general webpage search engine, etc.
Search servers 110 may, for example, include one or more web servers 110 designed to receive search queries and/or inputs from users 430, search one or more databases 115 in response to the search queries and/or inputs, and provide documents or information, relevant to the search queries and/or inputs, to users 430. In some implementations, search servers 110 may include a web search server 110 that may provide webpages to users 430, wherein a provided webpage may include a reference to a web server 110 at which the desired information and/or links are located. The references to the web server 110 at which the desired information is located may be included in a frame and/or text box, or as a link to the desired information/document. Document indexing servers 110 may include one or more devices designed to index documents available through networks 150. Document indexing servers 110 may access other servers 110, such as web servers 110 that host content, to index the content. In some implementations, document indexing servers 110 may index documents/records stored by other servers 110 connected to the network 150. Document indexing servers 110 may, for example, store and index content, information, and documents relating to user accounts and user-generated content. Web servers 110 may include servers 110 that provide webpages to clients 105. For instance, the webpages may be HTML-based webpages. A web server 110 may host one or more websites. As used herein, a website may refer to a collection of related webpages. Frequently, a website may be associated with a single domain name, although some websites may potentially encompass more than one domain name. The concepts described herein may be applied on a per-website basis. Alternatively, in some implementations, the concepts described herein may be applied on a per-webpage basis.
[00017] As used herein, a database 115 refers to a set of related data and the way it is organized.
Access to this data is usually provided by a database management system (DBMS) consisting of an integrated set of computer software that allows users 430 to interact with one or
more databases 115 and provides access to all of the data contained in the database. The DBMS provides various functions that allow entry, storage and retrieval of large quantities of information and provides ways to manage how that information is organized. Because of the close relationship between the database 115 and the DBMS, as used herein, the term database 115 refers to both a database 115 and DBMS.
[00018] FIG. 2 is an exemplary diagram of a client, server, and/or database 115 (hereinafter called “computing entity”), which may correspond to one or more of the clients 105, servers 110, and databases 115 according to an implementation consistent with the principles of the invention as described herein. The computing entity 200 may comprise a bus 210, a processor 220, memory 304, a storage device 250, a peripheral device 270, and a communication interface 280. The bus 210 may be defined as one or more conductors that permit communication among the components of the computing entity 200. The processor 220 may be defined as the logic circuitry that responds to and processes the basic instructions that drive the computing entity 200. Memory 304 may be defined as the integrated circuitry that stores information for immediate use in a computing entity 200. A peripheral device 270 may be defined as any hardware used by a user 430 and/or the computing entity 200 to facilitate communication between the two. A storage device 250 may be defined as a device used to provide mass storage to a computing entity 200. A communication interface 280 may be defined as any transceiver-like device that enables the computing entity 200 to communicate with other devices and/or computing entities 200.
[00019] The bus 210 may comprise a high-speed interface 308 and/or a low-speed interface 312 that connects the various components together in a way such that they may communicate with one another. A high-speed interface 308 manages bandwidth-intensive operations for the computing device 300, while a low-speed interface 312 manages lower bandwidth-intensive operations. In
some preferred embodiments, the high-speed interface 308 of a bus 210 may be coupled to the memory 304, display 316, and to high-speed expansion ports 310, which may accept various expansion cards such as a graphics processing unit (GPU). In other preferred embodiments, the low-speed interface 312 of a bus 210 may be coupled to a storage device 250 and low-speed expansion ports 314. The low-speed expansion ports 314 may include various communication ports, such as USB, Bluetooth, Ethernet, wireless Ethernet, etc. Additionally, the low-speed expansion ports 314 may be coupled to one or more peripheral devices 270, such as a keyboard, pointing device, scanner, and/or a networking device, wherein the low-speed expansion ports 314 facilitate the transfer of input data from the peripheral devices 270 to the processor 220 via the low-speed interface 312.
[00020] The processor 220 may comprise any type of conventional processor 220 or microprocessor 220 that interprets and executes computer readable instructions. The processor 220 is configured to perform the operations disclosed herein based on instructions stored within the system 400. The processor 220 may process instructions for execution within the computing entity 200, including instructions stored in memory 304 or on a storage device 250, to display graphical information for a graphical user interface (GUI) on an external peripheral device 270, such as a display 316. The processor 220 may provide for coordination of the other components of a computing entity 200, such as control of user interfaces, applications run by a computing entity 200, and wireless communication by a communication device of the computing entity 200. The processor 220 may be any processor 220 or microprocessor 220 suitable for executing instructions. In some embodiments, the processor 220 may have a memory device therein or coupled thereto suitable for storing the data, content, or other information or material disclosed herein. In some instances, the processor 220 may be a component of a larger computing entity 200. A computing
entity 200 that may house the processor 220 therein may include, but is not limited to, laptops, desktops, workstations, personal digital assistants, servers 110, mainframes, cellular telephones, tablet computers, or any other similar device. Accordingly, the inventive subject matter disclosed herein, in full or in part, may be implemented or utilized in devices including, but not limited to, laptops, desktops, workstations, personal digital assistants, servers 110, mainframes, cellular telephones, tablet computers, or any other similar device.
[00021] Memory 304 stores information within the computing device 300. In some preferred embodiments, memory 304 may include one or more volatile memory units. In another preferred embodiment, memory 304 may include one or more non-volatile memory units. Memory 304 may also include another form of computer-readable medium, such as a magnetic or optical disk. For instance, a portion of a magnetic hard drive may be partitioned as a dynamic scratch space to allow for temporary storage of information that may be used by the processor 220 when faster types of memory, such as random-access memory (RAM), are in high demand. A computer-readable medium may refer to a non-transitory computer-readable memory device. A memory device may refer to storage space within a single storage device 250 or spread across multiple storage devices 250. The memory 304 may comprise main memory 230 and/or read only memory (ROM) 240. In a preferred embodiment, the main memory 230 may comprise RAM or another type of dynamic storage device 250 that stores information and instructions for execution by the processor 220. ROM 240 may comprise a conventional ROM device or another type of static storage device 250 that stores static information and instructions for use by the processor 220. The storage device 250 may comprise a magnetic and/or optical recording medium and its corresponding drive.
[00022] As mentioned earlier, a peripheral device 270 is a device that facilitates communication between a user 430 and the processor 220. The peripheral device 270 may include, but is not
limited to, an input device and/or an output device. As used herein, an input device may be defined as a device that allows a user 430 to input data and instructions that are then converted into a pattern of electrical signals in binary code that are comprehensible to a computing entity 200. An input device of the peripheral device 270 may include one or more conventional devices that permit a user 430 to input information into the computing entity 200, such as a scanner, phone, camera, scanning device, keyboard, a mouse, a pen, voice recognition and/or biometric mechanisms, etc. As used herein, an output device may be defined as a device that translates the electronic signals received from a computing entity 200 into a form intelligible to the user 430. An output device of the peripheral device 270 may include one or more conventional devices that output information to a user 430, including a display 316, a printer, a speaker, an alarm, a projector, etc. Additionally, storage devices 250, such as CD-ROM drives, and other computing entities 200 may act as a peripheral device 270 that may act independently from the operably connected computing entity 200. For instance, a fitness tracker may transfer data to a smartphone, wherein the smartphone may use that data in a manner separate from the fitness tracker.
[00023] The storage device 250 is capable of providing the computing entity 200 mass storage. In some embodiments, the storage device 250 may comprise a computer-readable medium such as the memory 304, storage device 250, or a memory 304 on processor 220. A computer-readable medium may be defined as one or more physical or logical memory devices and/or carrier waves. Devices that may act as a computer readable medium include, but are not limited to, a hard disk device, optical disk device, tape device, flash memory or other similar solid-state memory device, or an array of devices, including devices in a storage area network or other configurations. Examples of computer-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM discs and DVDs;
magneto-optical media such as optical discs; and hardware devices that are specially configured to store and perform programming instructions, such as ROM 240, RAM, flash memory, and the like.
[00024] In an embodiment, a computer program may be tangibly embodied in the storage device
250. The computer program may contain instructions that, when executed by the processor 220, performs one or more steps that comprise a method, such as those methods described herein. The instructions within a computer program may be carried to the processor 220 via the bus 210. Alternatively, the computer program may be carried to a computer-readable medium, wherein the information may then be accessed from the computer-readable medium by the processor 220 via the bus 210 as needed. In a preferred embodiment, the software instructions may be read into memory 304 from another computer-readable medium, such as data storage device 250, or from another device via the communication interface 280. Alternatively, hardwired circuitry may be used in place of or in combination with software instructions to implement processes consistent with the principles as described herein. Thus, implementations consistent with the invention as described herein are not limited to any specific combination of hardware circuitry and software.
[00025] FIG. 3 depicts exemplary computing entities 200 in the form of a computing device 300 and mobile computing device 350, which may be used to carry out the various embodiments of the invention as described herein. A computing device 300 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers 110, databases 115, mainframes, and other appropriate computers. A mobile computing device 350 is intended to represent various forms of mobile devices, such as scanners, scanning devices, personal digital assistants, cellular telephones, smart phones, tablet computers, and other similar devices. The various components depicted in FIG. 3, as well as their connections, relationships,
and functions are meant to be examples only, and are not meant to limit the implementations of the invention as described herein. The computing device 300 may be implemented in a number of different forms, as shown in FIGS. 1 and 3. For instance, a computing device 300 may be implemented as a server 110 or in a group of servers 110. Computing devices 300 may also be implemented as part of a rack server system. In addition, a computing device 300 may be implemented as a personal computer, such as a desktop computer or laptop computer. Alternatively, components from a computing device 300 may be combined with other components in a mobile device, thus creating a mobile computing device 350. Each mobile computing device 350 may contain one or more computing devices 300 and mobile devices, and an entire system may be made up of multiple computing devices 300 and mobile devices communicating with each other as depicted by the mobile computing device 350 in FIG. 3. The computing entities 200 consistent with the principles of the invention as disclosed herein may perform certain receiving, communicating, generating, output providing, correlating, and storing operations as needed to perform the various methods as described in greater detail below.
[00026] In the embodiment depicted in FIG. 3, a computing device 300 may include a processor 220, memory 304, a storage device 250, high-speed expansion ports 310, low-speed expansion ports 314, and a bus 210 operably connecting the processor 220, memory 304, storage device 250, high-speed expansion ports 310, and low-speed expansion ports 314. In one preferred embodiment, the bus 210 may comprise a high-speed interface 308 connecting the processor 220 to the memory 304 and high-speed expansion ports 310 as well as a low-speed interface 312 connecting to the low-speed expansion ports 314 and the storage device 250. Because each of the components are interconnected using the bus 210, they may be mounted on a common motherboard as depicted in FIG. 3 or in other manners as appropriate. The processor 220 may
process instructions for execution within the computing device 300, including instructions stored in memory 304 or on the storage device 250. Processing these instructions may cause the computing device 300 to display graphical information for a graphical user interface (GUI) on an output device, such as a display 316 coupled to the high-speed interface 308. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memory units and/or multiple types of memory. Additionally, multiple computing devices may be connected, wherein each device provides portions of the necessary operations.
[00027] A mobile computing device 350 may include a processor 220, memory 304, a peripheral device 270 (such as a display 316), a communication interface 280, and a transceiver 368, among other components. A mobile computing device 350 may also be provided with a storage device 250, such as a micro-drive or other previously mentioned storage device 250, to provide additional storage. Preferably, each of the components of the mobile computing device 350 are interconnected using a bus 210, which may allow several of the components of the mobile computing device 350 to be mounted on a common motherboard as depicted in FIG. 3 or in other manners as appropriate. In some implementations, a computer program may be tangibly embodied in an information carrier. The computer program may contain instructions that, when executed by the processor 220, perform one or more methods, such as those described herein. The information carrier is preferably a computer-readable medium, such as memory, expansion memory 374, or memory 304 on the processor 220 such as ROM 240, that may be received via the transceiver or external interface 362. The mobile computing device 350 may be implemented in a number of different forms, as shown in FIG. 3. For example, a mobile computing device 350 may be implemented as a cellular telephone, part of a smart phone, personal digital assistant, or other similar mobile device.
[00028] The processor 220 may execute instructions within the mobile computing device 350, including instructions stored in the memory 304 and/or storage device 250. The processor 220 may be implemented as a chipset of chips that may include separate and multiple analog and/or digital processors. The processor 220 may provide for coordination of the other components of the mobile computing device 350, such as control of the user interfaces, applications run by the mobile computing device 350, and wireless communication by the mobile computing device 350. The processor 220 of the mobile computing device 350 may communicate with a user 430 through the control interface 358 coupled to a peripheral device 270 and the display interface 356 coupled to a display 316. The display 316 of the mobile computing device 350 may include, but is not limited to, Liquid Crystal Display (LCD), Light Emitting Diode (LED) display, Organic Light Emitting Diode (OLED) display, and Plasma Display Panel (PDP), or any combination thereof. The display interface 356 may include appropriate circuitry for causing the display 316 to present graphical and other information to a user 430. The control interface 358 may receive commands from a user 430 via a peripheral device 270 and convert the commands into a computer readable signal for the processor 220. In addition, an external interface 362 may be provided in communication with processor 220, which may enable near area communication of the mobile computing device 350 with other devices. The external interface 362 may provide for wired communications in some implementations or wireless communication in other implementations. In a preferred embodiment, multiple interfaces may be used in a single mobile computing device 350 as is depicted in FIG. 3.
[00029] Memory 304 stores information within the mobile computing device 350. Devices that may act as memory 304 for the mobile computing device 350 include, but are not limited to, computer-readable media, volatile memory, and non-volatile memory. Expansion memory 374
may also be provided and connected to the mobile computing device 350 through an expansion interface 372, which may include a Single In-Line Memory Module (SIM) card interface or micro secure digital (Micro-SD) card interface. Expansion memory 374 may include, but is not limited to, various types of flash memory and non-volatile random-access memory (NVRAM). Such expansion memory 374 may provide extra storage space for the mobile computing device 350. In addition, expansion memory 374 may store computer programs or other information that may be used by the mobile computing device 350. For instance, expansion memory 374 may have instructions stored thereon that, when carried out by the processor 220, cause the mobile computing device 350 to perform the methods described herein. Further, expansion memory 374 may have secure information stored thereon; therefore, expansion memory 374 may be provided as a security module for a mobile computing device 350, wherein the security module may be programmed with instructions that permit secure use of a mobile computing device 350. In addition, expansion memory 374 having secure applications and secure information stored thereon may allow a user 430 to place identifying information on the expansion memory 374 via the mobile computing device 350 in a non-hackable manner.
[00030] A mobile computing device 350 may communicate wirelessly through the communication interface 280, which may include digital signal processing circuitry where necessary. The communication interface 280 may provide for communications under various modes or protocols, including, but not limited to, Global System Mobile Communication (GSM), Short Message Services (SMS), Enterprise Messaging System (EMS), Multimedia Messaging Service (MMS), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Personal Digital Cellular (PDC), Wideband Code Division Multiple Access (WCDMA), IMT Multi-Carrier (CDMA2000), and General Packet Radio Service (GPRS), or any combination thereof. Such
communication may occur, for example, through a transceiver 368. Short-range communication may occur, such as using a Bluetooth, WIFI, or other such transceiver 368. In addition, a Global Positioning System (GPS) receiver module 370 may provide additional navigation- and location-related wireless data to the mobile computing device 350, which may be used as appropriate by applications running on the mobile computing device 350. Alternatively, the mobile computing device 350 may communicate audibly using an audio codec 360, which may receive spoken information from a user 430 and convert the received spoken information into a digital form that may be processed by the processor 220. The audio codec 360 may likewise generate audible sound for a user 430, such as through a speaker, e.g., in a handset of mobile computing device 350. Such sound may include sound from voice telephone calls, recorded sound such as voice messages, music files, etc. Sound may also include sound generated by applications operating on the mobile computing device 350.
[00031] FIGS. 4-9 illustrate embodiments of a system 400 for importing a paired application 415A into a virtual reality setting 425. FIG. 4 shows an embodiment of the disclosed system 400. The system 400 generally comprises a computing device 300, control device 405 operably connected to the computing device 300, computing entity 200, and display 316. In a preferred embodiment, the computing device 300 comprises a processor 220, expansion port 310, 314 operably connected to the processor 220, power supply, and a non-transitory computer-readable medium 402 coupled to the processor 220 and having instructions stored thereon. The control device 405 may be operably connected to the processor 220 via the expansion port 310, 314. In another preferred embodiment, the control device 405 may be operably connected to the processor 220 wirelessly via a communication interface 280 of the computing entity 200. In one preferred embodiment, a database 115 may be operably connected to the processor 220 in a way such that it may store data
created when the system 400 is operated by a user 430. It is understood that the various method steps associated with the methods of the present disclosure may be carried out as operations by the system 400 shown in FIG. 4. FIGS. 7-9 illustrate various methods that may be carried out by the system 400. FIG. 6 illustrates permission levels 600 that may be utilized by the system 400 for controlling access to user content 615, 635, 655. FIG. 5 illustrates an example virtual reality setting 425 that may be presented via a display 316 such as a virtual reality headset.
[00032] A control device 405 is defined as a device used to control a physical machine and/or its corresponding paired application 415A. In a preferred embodiment, a plurality of functions 410 is coupled to the control device 405 in a way that may cause a corresponding paired application 415A and/or physical machine to perform an action. A user 430 may activate the plurality of functions 410 of the control device 405 by manipulating the control device 405 in a way such that it sends an input to the system 400. For instance, a user 430 may activate a switch of a control device 405 operably connected to a crane that causes an input to be sent to the system 400, which in turn causes the crane to elevate a hoist. For instance, a user 430 may toggle the directional controls of a surgical robot in a way that causes an input to be sent to the system 400, which in turn causes the surgical robot to move its robotic appendages. For instance, a user 430 may press a touchscreen of a control device 405 operably connected to a light system in a way that causes an input to be sent to the system 400, which in turn causes the light system to darken. The control device 405 may transmit the input to the processor 220 via the expansion port 310, 314. The processor 220 may then transform the function into a digital input and/or a virtual reality input. A digital input may be defined as a computer readable signal used by the system 400 to manipulate the paired application 415A. A virtual reality input may be defined as a computer readable signal used by the system 400 to manipulate the virtual reality setting 425. In some embodiments, the system
400 may use the same functions within the plurality of functions 410 of a control device 405 to manipulate both the paired application 415A and virtual reality setting 425. In a preferred embodiment, the plurality of functions 410 of the control device 405 may be in the form of a driver.
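By way of illustration only, the following minimal sketch shows one way a single control-device function might be transformed into both a digital input (for the paired application 415A) and a virtual reality input (for the virtual reality setting 425), as described above. All class names, commands, and the mapping table are hypothetical assumptions, not part of the disclosure.

```python
# Hypothetical sketch: mapping one control-device function to a digital input
# and a virtual reality input. Names and commands are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DigitalInput:
    """Computer-readable signal used to manipulate the paired application."""
    command: str

@dataclass
class VirtualRealityInput:
    """Computer-readable signal used to manipulate the virtual reality setting."""
    command: str

# One physical function may drive both targets, as the paragraph above notes.
FUNCTION_MAP = {
    "switch_up": ("crane.hoist.raise", "vr.crane.hoist.raise"),
    "stick_left": ("robot.arm.left", "vr.robot.arm.left"),
}

def transform(function_name: str) -> tuple[DigitalInput, VirtualRealityInput]:
    digital_cmd, vr_cmd = FUNCTION_MAP[function_name]
    return DigitalInput(digital_cmd), VirtualRealityInput(vr_cmd)

digital, vr = transform("switch_up")
print(digital.command, vr.command)
```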
[00033] A driver may be defined as a software component that allows the control device 405 to communicate with the rest of the system 400. For instance, a paired application 415A requiring input from a control device 405 may call a process implemented by the system 400, which may call for a function implemented by a driver of the control device 405. The driver facilitates communication between the system 400 and the control device 405, wherein the processor 220 carries out the processes of the system 400 and the plurality of functions 410 of the driver. When the user 430 manipulates the control device 405 in a way that provides the system 400 with an input via the expansion port 310, 314, the processor 220 may pair the input with a certain function of the driver. Once the processor 220 pairs input from the control device 405 with a function of the driver, the processor 220 may relay the function to the application. In an embodiment, a driver may be used to transform the input from the control device 405 into a digital input and/or virtual reality input. In another embodiment, more than one driver may be used to communicate with a control device 405. These drivers may be layered in a driver stack, wherein each driver within the driver stack may perform a certain task. For instance, a stack having three drivers may have a first driver that communicates with the operating system and a second driver that communicates with the control device 405. The third driver may act as an intermediary between the first driver and second driver, wherein the third driver receives the plurality of functions 410 from the second driver and converts them into a digital input before relaying the digital input to the first driver. The first driver may then relay the digital input to the operating system, which may then relay the digital input to the paired application 415A.
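A minimal sketch of the three-layer driver stack described above follows, assuming simple string events and dictionary signals; the class and method names are illustrative, not the disclosure's API.

```python
# Hypothetical three-driver stack: a device driver reads raw control-device
# functions, an intermediary driver converts them to digital inputs, and an
# OS-facing driver relays them onward. All names are assumptions.
class DeviceDriver:
    """Second driver: communicates with the control device."""
    def read_function(self, raw_event: str) -> str:
        return raw_event.strip().lower()  # e.g. "SWITCH_UP\n" -> "switch_up"

class IntermediaryDriver:
    """Third driver: converts functions into digital inputs."""
    def to_digital_input(self, function_name: str) -> dict:
        return {"type": "digital", "command": function_name}

class OsDriver:
    """First driver: relays digital inputs toward the operating system."""
    def relay(self, digital_input: dict) -> None:
        print("OS -> paired application:", digital_input)

def driver_stack(raw_event: str) -> None:
    device, intermediary, os_driver = DeviceDriver(), IntermediaryDriver(), OsDriver()
    function_name = device.read_function(raw_event)
    os_driver.relay(intermediary.to_digital_input(function_name))

driver_stack("SWITCH_UP\n")
```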
[00034] A paired application 415A may be defined as a software application that has been created to work with a physical machine and/or control device 405. A physical machine may be defined as a mechanical device that a user 430 may operate to accomplish a specific task. For instance, a vascular robot designed to assist medical professionals in implanting a stent within a patient may be coupled with a paired application 415A that allows a user 430 to control the vascular robot. For instance, farm equipment designed to assist workers in tending crops may be coupled with a paired application 415A that allows a user 430 to control various functions of the farm equipment. For instance, a mobile computing device 350 may be coupled with a paired application 415A that allows a user 430 to perform various tasks. In one preferred embodiment, the control device 405 may control various functions of the physical machine without the assistance of the paired application. In other embodiments, the paired application may control every aspect of the physical machine. Control of the physical machine may be accomplished by using the software via manipulation of the control device 405. In an embodiment, physical machines may have a virtual embodiment within a virtual reality setting 425. The virtual embodiment within the virtual reality setting 425 may be manipulated by a control device 405 operably connected to the system 400, wherein the plurality of functions 410 of the control device 405 may be transformed into virtual reality inputs that alter the virtual reality setting 425.
[00035] A virtual reality setting 425 is defined as an artificial, interactive, computer-created scene or 'world' within which a user 430 may immerse themselves. The user 430 may navigate the virtual reality setting 425 using the display 316 and the control device 405. In an embodiment, a virtual reality setting 425 may comprise a virtual reality environment 425A, a virtual screen 425B, and a plurality of virtual reality objects 425C. The virtual reality environment 425A defines the boundaries of the virtual reality setting 425. A virtual reality setting 425 has no physical boundaries
and must therefore have virtual boundaries. In an embodiment, the size of a virtual reality environment 425A is relative to that of a physical environment, wherein the physical environment may be defined by a plurality of spatial positions 470 and the virtual reality environment 425A may be defined by a plurality of virtual positions 480. For instance, a virtual reality setting 425 comprising a surgical room may have a square boundary that appears similar in size to that of a physical surgical room from the point of view of the user 430 as viewed through the display 316. In some embodiments of a virtual reality setting 425, a user 430 may not be able to view the boundaries of the environment. For instance, a user 430 having a virtual position 480 in the center of a virtual reality setting 425 comprising one square mile of densely forested land may not see the boundaries of the virtual reality setting 425; thus, the virtual reality setting 425 may appear boundaryless. A user 430 may navigate a virtual reality setting 425 by changing their virtual position 480 within the virtual reality setting 425. In an embodiment, the system 400 may alter the virtual position 480 of a user 430 by measuring changes in a user's 430 spatial position 470 and converting those changes into changes in the user's 430 virtual position 480. For instance, a virtual reality headset worn by a user 430 may measure the spatial position 470 of the user 430 and transmit the spatial position 470 to the system 400, wherein the system 400 may convert the change in spatial position 470 into a change in the user's 430 virtual position 480. In another embodiment, a user 430 may use a control device 405 to alter their virtual position 480 within a virtual reality setting 425.
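A minimal sketch of this spatial-to-virtual conversion follows, assuming a fixed linear scale between the physical environment and the virtual reality environment 425A; the function name and coordinate convention are assumptions for illustration only.

```python
# Hypothetical sketch: convert a change in a user's spatial position 470 into
# a change in their virtual position 480, assuming a fixed linear scale.
def update_virtual_position(prev_spatial, new_spatial, virtual, scale=1.0):
    """Apply the measured spatial delta, scaled, to the virtual position."""
    delta = tuple(n - p for n, p in zip(new_spatial, prev_spatial))
    return tuple(v + scale * d for v, d in zip(virtual, delta))

# Headset reports the user moved 0.5 m forward; mirror that move in VR.
virtual = update_virtual_position((0.0, 0.0, 0.0), (0.0, 0.0, 0.5), (10.0, 0.0, 10.0))
print(virtual)  # (10.0, 0.0, 10.5)
```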
[00036] The virtual reality environment 425A may be populated by the plurality of virtual reality objects 425C. The plurality of virtual reality objects 425C may provide the virtual reality setting 425 with characteristics of the physical setting that the virtual reality setting 425 is meant to emulate. For instance, a virtual reality setting 425 created to emulate a surgical room may be
populated with a floor, walls, ceiling, examination table, and a viewing window. For instance, a virtual reality setting 425 created to emulate a construction site may be populated with a fence, dirt, and a bulldozer. For instance, a virtual reality setting 425 created to emulate a beach resort may have sand, beach chairs, a volleyball net, and an ocean. In an embodiment, the plurality of virtual reality objects 425C may comprise a virtual embodiment of a physical machine. The plurality of virtual reality objects 425C may be manipulated by a user 430 as the user 430 interacts with the virtual reality setting 425. A user 430 may interact with the virtual reality objects 425C using a control device 405 or some other form of equipment designed to allow a user 430 to interact with the virtual reality setting 425. For instance, a user 430 may wear haptic gloves that allow the user 430 to interact with objects within a virtual reality setting 425. For instance, a user 430 wearing a virtual reality headset that measures their spatial position 470 may interact with the environment by changing their virtual position 480 within the virtual reality setting 425. In embodiments of a virtual reality setting 425 having a virtual embodiment of a physical machine, the user 430 may interact with the virtual embodiment in the virtual reality setting 425 as they would be able to interact with a physical machine in a physical setting.
[00037] In another embodiment, the plurality of virtual reality objects 425C that populate the virtual reality environment 425A may comprise a virtual reality display. The virtual reality display may be coupled to a virtual screen 425B such that manipulating the virtual reality display to change its virtual position 480 within the virtual reality setting 425 also manipulates the virtual position 480 of the virtual screen 425B. The virtual reality display may be defined as a virtual reality object 425C within a virtual reality setting 425 that is coupled to the virtual screen 425B. A virtual screen 425B may be defined as a virtual window that allows for data not part of the virtual reality setting 425 to be streamed into the virtual reality setting 425. A virtual window
may be coupled to a streaming application 415 in a way such that data viewed within the streaming application 415 may be viewed within the virtual screen 425B. For instance, a streaming application 415 designed to execute a paired application 415A may be operably connected to a virtual screen 425B within a virtual reality setting 425. A user 430 within the virtual reality setting 425 may view the paired application 415A as it is being executed within the streaming application 415 even though the paired application 415A is not part of the virtual reality setting 425. In an embodiment, the data streamed by the streaming application 415 to a virtual screen 425B in a virtual reality setting 425 is video data, wherein a user 430 within the virtual reality setting 425 may view the video data within the virtual screen 425B.
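An illustrative sketch of this coupling follows: frames captured from the paired application are pushed to a small queue that the virtual screen presents each render pass. The class names, queue depth, and byte-string frames are assumptions made for the example.

```python
# Hypothetical sketch: couple a streaming application 415 to a virtual
# screen 425B by queueing captured video frames for the VR renderer.
from collections import deque

class VirtualScreen:
    def __init__(self):
        self.frames = deque(maxlen=2)  # keep only the freshest frames
    def present(self):
        return self.frames[-1] if self.frames else None

class StreamingApplication:
    def __init__(self, screen: VirtualScreen):
        self.screen = screen
    def capture_and_stream(self, paired_app_framebuffer: bytes) -> None:
        # Video data of the paired application, streamed into the VR setting.
        self.screen.frames.append(paired_app_framebuffer)

screen = VirtualScreen()
stream = StreamingApplication(screen)
stream.capture_and_stream(b"\x00" * 16)  # stand-in for one captured frame
print(screen.present())
```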
[00038] In some embodiments, a virtual reality setting 425 may have multiple virtual screens 425B that may stream data from multiple streaming applications 415. For instance, a virtual reality setting 425 comprising a sports bar may have multiple virtual screens 425B coupled to multiple virtual reality displays that are designed to look like flat screen televisions. Each virtual screen 425B within the plurality of virtual screens 425B may stream a different sports-related event depending on the data streamed from the streaming applications 415 operably connected to the plurality of virtual screens 425B. Additionally, a user 430 may choose which virtual screen 425B amongst the plurality of virtual screens 425B to watch by altering their virtual position 480 within the virtual reality setting 425. The streaming application 415 operably connected to the virtual screen 425B of a virtual reality setting 425 may be executed by the system 400 or by a computing entity 200 operably connected to the system 400.
[00039] In an embodiment, a computing entity 200 operably connected to the system 400 renders the virtual reality setting 425. The computing entity 200 that renders the virtual reality setting 425 preferably comprises a display 316, secondary processor 220 operably connected to the display
316, a secondary power supply, and a secondary non-transitory computer-readable medium 402 having instructions stored thereon. The secondary processor 220 of the computing entity 200 preferably receives the virtual reality setting 425 from the non-transitory computer-readable medium 402. Once received, the secondary processor 220 may render the virtual reality setting 425 and transmit the virtual reality setting 425 to the display 316. In a preferred embodiment, the display 316 is a virtual reality headset as depicted in FIG. 4. The virtual reality headset may present the virtual reality setting 425 to the user 430. The secondary processor 220 may receive a stream from the system 400 and transmit that stream to the virtual screen 425B of the virtual reality setting 425. A viewer may then view both the virtual reality setting 425 and the stream within the virtual screen 425B. In an embodiment, the virtual screen 425B may stream the paired application 415A running within the streaming application 415 of the system 400.
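The secondary processor's role can be sketched as a simple loop, under the assumption that receiving the setting, receiving the stream, and driving the display are injectable callables; every name below is illustrative, not the disclosure's API.

```python
# Hedged sketch of the secondary processor's loop: receive the VR setting,
# receive the stream from the system, composite the stream onto the virtual
# screen 425B, and hand the result to the display 316.
def render_loop(receive_setting, receive_stream, display, running):
    setting = receive_setting()            # VR setting from the CRM
    while running():
        frame = receive_stream()           # paired application video data
        setting["virtual_screen"] = frame  # composite stream onto the screen
        display(setting)                   # present via the VR headset

# Example wiring with trivial stand-ins for the callables above.
ticks = iter(range(3))
render_loop(lambda: {"virtual_screen": None},
            lambda: b"frame",
            lambda s: print("displayed", s["virtual_screen"]),
            lambda: next(ticks, None) is not None)
```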
[00040] In some embodiments, the virtual reality setting 425 may present a user 430 with a virtual scenario. A virtual scenario may be defined as a situation in which a user 430 must perform a series of actions to complete a task presented by the virtual reality setting 425. For instance, a user 430 may be presented a virtual scenario in which they are asked to perform a surgical procedure using a surgical robot. For instance, a user 430 may be presented a virtual scenario in which they are asked to operate computerized farm equipment in a way such that it plows a field in a desired pattern within the virtual reality setting 425. Information streamed into the virtual screen 425B may assist the user 430 in completing the virtual scenario. For instance, a user 430 within a virtual scenario involving a surgical procedure may view a virtual screen 425B of an instructor giving step-by-step instructions on how to properly perform the surgical procedure. In an embodiment, the virtual scenario may be designed for a particular paired application 415A. For instance, a virtual scenario designed to train a pilot to fly an Airbus 210 A380 may comprise a scenario inside the
cockpit of an Airbus 210 A380, wherein the virtual embodiment of the cockpit may have a plurality of virtual screens 425B streaming the paired applications 415A of the Airbus 210 A380.
[00041] In an embodiment, actions taken by the user 430 within the virtual reality setting 425 may be recorded by the computing entity 200. In other embodiments, results of the virtual scenario may be recorded to allow other users 430 to analyze progress made via training. For instance, a virtual scenario may grade the performance of a user 430 by comparing a user’s 430 actions with the predetermined desired actions of the virtual scenario. An instructor may then use these results to provide further instruction to a user 430 when needed. In yet another embodiment, the system 400 may record the user 430 within the virtual scenario. In one preferred embodiment, the recording may be from the user’s 430 point of view within the virtual reality setting 425 as seen through the display 316. In yet another preferred embodiment, the recording may be of the virtual reality setting 425 itself, wherein every aspect of the virtual reality setting 425 is recorded as the user 430 interacts with the virtual scenario so that an instructor may view a user’s 430 actions within the virtual scenario from any perspective within the virtual reality environment 425A. For instance, a recording of a virtual scenario involving combat training may allow an instructor to view the actions of the user 430 from various virtual positions 480 within the virtual reality setting 425. The recorded virtual scenario results of a user 430 may be stored within the computing entity 200. In an embodiment, the training results may be stored within a user profile 420 on a storage device 250.
[00042] A user profile 420 may be defined as a digital folder that contains information pertaining to a particular user 430. For instance, a user profile 420 may contain personal information such as name, date of birth, photograph, etc. For instance, a user profile 420 may contain information pertaining to training scenarios they have undertaken. The information within a user profile 420
may be organized using various files and folders. For instance, personal information may be stored in a separate folder from information regarding training scenarios. In an embodiment, the system 400 may store the information within a non-transitory computer-readable medium 402 of the computing entity 200. In another embodiment, the information may be stored on a database 115 operably connected to the processor 220, wherein access to the database 115 may be limited based on various permission levels 600. In yet another embodiment, a server 110 may be operably connected to the processor 220 and database 115 to facilitate the transfer of data between the computing entity 200 and the database 115.
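One possible in-memory layout of a user profile 420, keeping personal information apart from training-scenario results as described above, is sketched here; the field names are assumptions for illustration.

```python
# Hypothetical user profile 420 layout: personal information is kept in a
# separate "folder" (field) from virtual scenario results.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    name: str
    date_of_birth: str
    personal: dict = field(default_factory=dict)        # personal-info folder
    scenario_results: list = field(default_factory=list)  # training folder

profile = UserProfile(name="Jane Doe", date_of_birth="1990-01-01")
profile.scenario_results.append({"scenario": "stent_implant", "grade": 0.92})
print(profile)
```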
[00043] The system 400 and method of the present disclosure may use a plurality of user profiles 420 stored within a database. Each user profile 420 may be constructed using resulting data from actions taken by users 430 during a virtual scenario. In a preferred embodiment, each virtual scenario may have a number of performance factors associated therewith, each of which corresponds to an aspect of a professional's performance. Each performance factor has a plurality of defined performance limits associated therewith. By comparing the actions of the user 430 with the performance factors, the system 400 may grade a user's 430 performance within the scenario. In this way, the performance factors and performance limits associated therewith may be used to define a performance grade that a particular user 430 achieved during a particular virtual scenario. Performance factors tied to a particular virtual scenario may depend on the type of training and the objective of the virtual scenario. For instance, a virtual scenario involving implanting a heart stent may take speed, location, and invasiveness into consideration when determining the performance grade of a user 430. For instance, a virtual scenario involving flying an airplane may determine a user's 430 performance grade based on takeoff and landing results.
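A minimal grading sketch follows: each performance factor has defined limits, a measured value is clamped and normalized against them, and the grade averages the factors. The specific factors, limits, and "lower is better" scoring are assumptions chosen for the heart-stent example above.

```python
# Hypothetical grading: normalize each measured factor against its defined
# performance limits, then average. Factors and limits are assumptions.
def grade(measurements: dict, limits: dict) -> float:
    """Return a 0..1 performance grade from per-factor (low, high) limits."""
    scores = []
    for factor, value in measurements.items():
        low, high = limits[factor]
        # Clamp into the defined performance limits, then normalize so that
        # a lower measured value (faster, closer, less invasive) scores higher.
        clamped = min(max(value, low), high)
        scores.append((high - clamped) / (high - low))
    return sum(scores) / len(scores)

limits = {"speed_s": (120, 600), "location_mm": (0, 10), "invasiveness": (0, 5)}
print(grade({"speed_s": 300, "location_mm": 2, "invasiveness": 1}, limits))
```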
[00044] In an embodiment, the programming instructions responsible for the operations carried out
by the processor 220 are stored on a non-transitory computer-readable medium 402 (“CRM”), which may be coupled to the processor 220, as shown in FIG. 4. Alternatively, the programming instructions may be stored or included within the processor 220. Examples of non-transitory computer-readable media 402 include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM discs and DVDs; magneto-optical media; and hardware devices that are specifically configured to store and perform programming instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. In some embodiments, the programming instructions may be stored as modules within the non-transitory computer-readable medium 402.
[00045] As mentioned previously, computing entities 200 may communicate audibly, meaning computing entities 200 may transmit and receive information via sound waves and convert the sound waves into digital information. For instance, a user 430 may instruct a user interface of a computing entity 200 with their voice to perform a certain action. The processor 220 may convert the sound waves of the user 430 into instructions, which the processor 220 may then carry out. For instance, a user 430 may audibly communicate with the system 400 in a way that causes the system 400 to alter a user's 430 virtual position 480 within a virtual reality setting 425. Computing entities 200 may likewise generate audible sound for a user 430, such as through an audio device. Such sound may include sound from voice telephone calls, recorded notes, voice messages, music files, etc. Audible sounds may also include sound generated by applications operating on a computing entity 200. For instance, an application running on a mobile computing entity 200 may be configured in a way such that when a certain condition is met the application causes the mobile computing entity 200 to output a sound. For instance, an application may be configured in a way such that an alarming sound is emitted via an audio device connected to the computing entity 200
at a certain time of day. For instance, the processor 220 may receive a signal indicating that the user 430 is about to make or has made a mistake during the virtual scenario while using the control device 405. The processor 220 may then convert this signal into an audio message that may be sent to an audio device to make the user 430 aware of the mistake.
[00046] As mentioned previously, the system 400 may further comprise a user interface. A user interface may be defined as a space where interactions between a user 430 and the system 400 may take place. In an embodiment, the interactions may take place in a way such that a user 430 may control the operations of the system 400. A user interface may include, but is not limited to, operating systems, command line user interfaces, conversational interfaces, web-based user interfaces, zooming user interfaces, touch screens, task-based user interfaces, touch user interfaces, text-based user interfaces, intelligent user interfaces, and graphical user interfaces, or any combination thereof. The system 400 may present data of the user interface to the user 430 via a display 316 operably connected to the processor 220. A display 316 may be defined as an output device that communicates data that may include, but is not limited to, visual, auditory, cutaneous, kinesthetic, olfactory, and gustatory data, or any combination thereof.
[00047] Information presented via a display 316 may be referred to as a soft copy of the information because the information exists electronically and is presented for a temporary period of time. Information stored on the non-transitory computer-readable medium 402 may be referred to as the hard copy of the information. For instance, a display 316 may present a soft copy of visual information via a liquid crystal display (LCD), wherein the hard copy of the visual information is stored on a local hard drive. For instance, a display 316 may present a soft copy of audio information via a speaker, wherein the hard copy of the audio information is stored on a flash drive. For instance, a display 316 may present a soft copy of tactile information via a haptic suit,
wherein the hard copy of the tactile information is stored within a database. Displays 316 may include, but are not limited to, cathode ray tube monitors, LCD monitors, light emitting diode (LED) monitors, gas plasma monitors, screen readers, speech synthesizers, haptic suits, virtual reality headsets, speakers, and scent generating devices, or any combination thereof.
[00048] As mentioned previously, the system 400 may comprise a power supply. The power supply may be any source of power that provides the system 400 with electricity. In an embodiment, the power supply may be a stationary power outlet. Additionally, the system 400 may comprise multiple power supplies that may provide power to the system 400 in different circumstances. For instance, the system 400 may be directly plugged into a stationary power outlet, which may provide power to the system 400 so long as it remains in one place. However, the system 400 may also be connected to a backup battery so that the system 400 may receive power even when it is not connected to a stationary power outlet.
[00049] To prevent unauthorized users 430 from accessing other users' 430 information, the system 400 may employ a security method. As illustrated in FIG. 6, the security method of the system 400 may comprise a plurality of permission levels 600 that may grant users 430 access to user content 615, 635, 655 within the database 115 while simultaneously denying users without appropriate permission levels 600 the ability to view user content 615, 635, 655. To access the user content 615, 635, 655 stored within the database 115, users 430 may be required to make a request via a user interface. Access to the data within the database 115 may be granted or denied by the processor 220 based on verification of a requesting user's 605, 625, 645 permission level 600. If the requesting user's 605, 625, 645 permission level 600 is sufficient, the processor 220 may provide the requesting user 605, 625, 645 access to user content 615, 635, 655 stored within the database 115. Conversely, if the requesting user's 605, 625, 645 permission level 600 is
insufficient, the processor 220 may deny the requesting user 605, 625, 645 access to user content 615, 635, 655 stored within the database 115. In an embodiment, permission levels 600 may be based on user roles 610, 630, 650 and administrator roles 670, as illustrated in FIG. 6. User roles 610, 630, 650 allow requesting users 605, 625, 645 to access user content 615, 635, 655 that a user 430 has uploaded and/or otherwise obtained through use of the system 400. Administrator roles 670 allow administrators 665 to access data across the entire system 400.
[00050] In an embodiment, user roles 610, 630, 650 may be assigned to a user in a way such that a requesting user 605, 625, 645 may view user profiles 420 containing virtual scenario results and personal information via a user interface. In an embodiment, the system 400 may be configured to send an instructor a notification indicating that a user has obtained new virtual scenario results. To access the data within the database 115, a user may make a user request via the user interface to the processor 220. In an embodiment, the processor 220 may grant or deny the request based on the permission level 600 associated with the requesting user 605, 625, 645. Only users having appropriate user roles 610, 630, 650 or administrator roles 670 may access the data within the user profiles 420. For instance, as illustrated in FIG. 6, requesting user 1 605 has permission to view user 1 content 615 and user 2 content 635 whereas requesting user 2 625 only has permission to view user 2 content 635. Alternatively, user content 615, 635, 655 may be restricted in a way such that a user may only view a limited amount of user content 615, 635, 655. For instance, requesting user 3 645 may be granted a permission level 600 that only allows them to view user 3 content 655 related to personal information but not user 3 content 655 related to virtual scenario results. In the example illustrated in FIG. 6, an administrator 665 may bestow a new permission level 600 on users so as to grant them greater or lesser permissions. For instance, an administrator 665 may bestow a greater permission level 600 on other users so that they may view
user 3's content 655 and/or any other user's content 615, 635, 655. Therefore, the permission levels 600 of the system 400 may be assigned to users 430 in various ways without departing from the inventive subject matter described herein.
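A hedged sketch of the FIG. 6 permission check follows: a request is granted only if the requesting user's permission level covers the target content, with a wildcard standing in for the administrator role. The data model and names are assumptions for illustration.

```python
# Hypothetical permission check mirroring FIG. 6: user roles map to the
# content a requesting user may view; "*" stands in for administrator 665.
PERMISSIONS = {
    "user1": {"user1_content", "user2_content"},  # requesting user 1 605
    "user2": {"user2_content"},                   # requesting user 2 625
    "admin": {"*"},                               # administrator role 670
}

def request_content(requesting_user: str, content: str):
    allowed = PERMISSIONS.get(requesting_user, set())
    if "*" in allowed or content in allowed:
        return f"{content} (granted)"
    return None  # processor denies the request

print(request_content("user1", "user2_content"))  # granted
print(request_content("user2", "user1_content"))  # None -> denied
```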
[00051] FIG. 7 provides a flow chart 700 illustrating certain, preferred method steps that may be used to carry out the method of importing a paired application 415A into a virtual reality setting 425. Step 705 indicates the beginning of the method. During step 710, the processor 220 may execute the application software within the streaming application 415. Once the paired application 415A has been executed within the streaming application 415, the processor 220 may capture video data of the paired application 415A running within the streaming application 415 during step 715. The processor 220 may render the virtual reality setting 425 during step 712, wherein the processor 220 may subsequently stream the previously captured video data to the virtual reality setting 425 during step 720. In an embodiment, the virtual reality setting 425 may be rendered on a computing entity 200 operably connected to the system 400, wherein the paired application 415A may be imported into the virtual reality setting 425 rendered by the computing entity 200 via the processor 220. Once the video data has been streamed into the virtual reality setting 425, the processor 220 may display the data within the virtual screen 425B of the virtual reality setting 425 during step 725. The processor 220 may then transmit the virtual reality setting 425 and the stream running within the virtual screen 425B to a user 430 during step 730. Once received, the display 316 may present the virtual reality setting 425 and the paired application 415A within the virtual screen 425B to the user 430 during step 735. Once displayed to the user 430, the method may proceed to the terminate method step 740.
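The FIG. 7 flow can be sketched as a straight-line sequence of injectable steps; the function names below are assumptions, not the disclosure's API, and the step comments map each call back to the flow chart.

```python
# Hedged sketch of the FIG. 7 flow (steps 710-735) with stubbed callables.
def import_paired_application(execute, capture_video, render_setting,
                              stream, to_virtual_screen, transmit, present):
    execute()                    # step 710: run paired app in streaming app
    setting = render_setting()   # step 712: render the VR setting
    video = capture_video()      # step 715: capture video of the paired app
    stream(video, setting)       # step 720: stream video into the setting
    to_virtual_screen(setting)   # step 725: show stream in virtual screen
    transmit(setting)            # step 730: send setting + stream onward
    present(setting)             # step 735: display presents it to the user

# Trivial stand-ins so the sketch runs end to end.
import_paired_application(lambda: None, lambda: b"frame", dict,
                          lambda v, s: s.update(video=v),
                          lambda s: s.update(on_screen=True),
                          lambda s: None, print)
```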
[00052] FIG. 8 provides a flow chart 800 illustrating certain, preferred method steps that may be used to carry out the method of receiving an input from a control device 405 and transforming that
input into a digital input and virtual reality input. Step 805 indicates the beginning of the method. During step 810, the processor 220 may receive an input from a control device 405 that has been manipulated by a user 430. In a preferred embodiment, a control device 405 may have a plurality of buttons and/or switches that may be manipulated in a way that allows a user 430 to control what input of the control device 405 the system 400 receives. Once the processor 220 has received the input from the control device 405, the processor 220 may convert the input into a computer readable signal via a driver during step 815. The processor 220 preferably converts the input into a function, which may then be converted by the processor 220 into a digital signal. In embodiments of the system 400 comprising a virtual reality setting 425 having a virtual embodiment, the processor 220 may also convert the function into a virtual reality signal as illustrated in FIG. 9. Once the processor 220 has converted the input into a digital signal and a virtual reality signal, the processor 220 may transmit the computer readable signal to the paired application 415A during step 820. The processor 220 may then perform a query during step 822 to determine whether or not there is a virtual embodiment within the virtual reality setting 425. During step 825 the processor 220 may determine what to do based on the query. If the virtual reality setting 425 does not comprise a virtual embodiment of a physical machine, the system 400 may proceed to the terminate method step 830. If the system 400 does comprise a virtual reality setting 425 having a virtual embodiment of a physical machine, the system 400 may proceed to step 827, wherein the processor 220 may transmit the virtual reality signal to the virtual reality setting 425. Once the virtual reality signal has been transmitted to the virtual reality setting 425, the system 400 may proceed to the terminate method step 830.
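A minimal sketch of this FIG. 8 flow follows, assuming string inputs and dictionary signals; the names and data shapes are illustrative assumptions only.

```python
# Hypothetical FIG. 8 flow: convert a control-device input via the driver,
# send the digital signal to the paired application, and send the virtual
# reality signal only if the setting has a virtual embodiment (steps 822/825).
def handle_input(raw_input: str, paired_app: list, setting: dict) -> None:
    function = raw_input.lower()                 # step 815: driver converts input
    digital_signal = {"cmd": function}
    vr_signal = {"cmd": f"vr.{function}"}
    paired_app.append(digital_signal)            # step 820: to paired application
    if setting.get("virtual_embodiment"):        # steps 822/825: perform query
        setting.setdefault("signals", []).append(vr_signal)  # step 827

app_log, setting = [], {"virtual_embodiment": True}
handle_input("HOIST_UP", app_log, setting)
print(app_log, setting["signals"])
```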
[00053] FIG. 9 provides a flow chart 900 illustrating certain, preferred method steps that may be used to carry out the method of importing a paired application 415A into a virtual reality setting
425. Step 905 indicates the beginning of the method. During step 910 a user 430 is provided with the system 400. The user 430 is then provided with a paired application 415A in step 915, wherein the paired application 415A is executed on the system 400. In an embodiment, the paired application 415A is coupled to a physical machine, wherein a virtual embodiment of the physical machine may be manipulated by the paired application 415A within a virtual reality setting 425. In another embodiment, the paired application 415A is executed within a streaming application 415 of the system 400. The user 430 is provided with the control device 405 coupled to a plurality of functions 410 during step 920, wherein manipulation of the control device 405 by the user 430 allows the user 430 to control the paired application 415A. The user 430 may then be provided with the virtual reality setting 425 in step 925, wherein the virtual reality setting 425 is rendered by the system 400. In an embodiment, the virtual reality setting 425 is made specifically for the paired application 415A. The system 400 then renders the virtual reality setting 425 and executes the paired application 415A during step 930.
[00054] Once the system 400 has rendered the virtual reality setting 425 and executed the paired application 415A, the system 400 may stream the paired application 415A to the virtual reality setting 425 during step 935. In an embodiment, the virtual reality setting 425 comprises a virtual screen 425B to which the paired application 415A is streamed. Users 430 within the virtual reality setting 425 may view the data streamed to the virtual screen 425B. The system 400 may then receive inputs from the control device 405 during step 940, which may instruct the system 400 how the user 430 wishes to proceed. In an embodiment, a driver facilitates communication between the control device 405 and the system 400, wherein the inputs of the control device 405 correspond to certain functions within the plurality of functions of the driver. Once the plurality of functions 410 has been received, the system 400 may transform the functions into a computer
readable signal during step 945. In an embodiment, the computer readable signal may be a digital input or a virtual input. The system 400 may then use this computer readable signal to alter the paired application 415A during step 950. In another preferred embodiment, the system 400 may use the computer readable signal to alter the virtual reality setting 425. To ensure that a user 430 may view the virtual reality setting 425 and paired application 415A streamed to the virtual screen 425B, the user 430 is provided with a display 316 in step 955. In a preferred embodiment, the display 316 is a virtual reality headset. Once the user 430 has been presented with a display 316, the system 400 may transmit the virtual reality setting 425 comprising a virtual screen 425B streaming the paired application 415A to the display 316 and present it to the user 430 during step 960. Once the virtual reality setting 425 and paired application 415A have been presented to the user 430, the method may proceed to the terminate method step 965.
[00055] The subject matter described herein may be embodied in systems, apparatuses, methods, and/or articles depending on the desired configuration. In particular, various implementations of the subject matter described herein may be realized in digital electronic circuitry, integrated circuitry, specially designed application specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include implementation in one or more computer programs that may be executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, and at least one peripheral device.
[00056] These computer programs, which may also be referred to as programs, software, applications, software applications, components, or code, may include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-
oriented programming language, and/or in assembly/machine language. As used herein, the term “non-transitory computer-readable medium” refers to any computer program, product, apparatus, and/or device, such as magnetic discs, optical disks, memory, and Programmable Logic Devices (PLDs), used to provide machine instructions and/or data to a programmable processor, including a non-transitory computer-readable medium that receives machine instructions as a computer-readable signal. The term “computer-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor. To provide for interaction with a user, the subject matter described herein may be implemented on a computer having a display device, such as a cathode ray tube (CRT), liquid crystal display (LCD), or light emitting diode (LED) monitor, for displaying information to the user, and a keyboard and a pointing device, such as a mouse or a trackball, by which the user may provide input to the computer. Displays may include, but are not limited to, visual, auditory, cutaneous, kinesthetic, olfactory, and gustatory displays, or any combination thereof.
[00057] Other kinds of devices may be used to facilitate interaction with a user as well. For instance, feedback provided to the user may be any form of sensory feedback, such as visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form including, but not limited to, acoustic, speech, or tactile input. The subject matter described herein may be implemented in a computing system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server, or that includes a front-end component, such as a client computer having a graphical user interface or a Web browser through which a user may interact with the system described herein, or any combination of such back-end, middleware, or front-end components. The components of the system may be interconnected by any form or medium of digital data communication, such as a communication
network. Examples of communication networks may include, but are not limited to, a local area network (“LAN”), a wide area network (“WAN”), a metropolitan area network (“MAN”), and the Internet.
[00058] The implementations set forth in the foregoing description do not represent all implementations consistent with the subject matter described herein. Instead, they are merely some examples consistent with aspects related to the described subject matter. Although a few variations have been described in detail above, other modifications or additions are possible. In particular, further features and/or variations can be provided in addition to those set forth herein. For instance, the implementations described above can be directed to various combinations and subcombinations of the disclosed features and/or combinations and subcombinations of several further features disclosed above. In addition, the logic flows depicted in the accompanying figures and/or described herein do not necessarily require the particular order shown, or sequential order, to achieve desirable results. It will be readily understood by those skilled in the art that various other changes in the details, materials, and arrangements of the parts and method stages which have been described and illustrated in order to explain the nature of this inventive subject matter can be made without departing from the principles and scope of the inventive subject matter.
What is claimed is:
1) A system for importing a software application into a virtual reality setting, said system comprising: an expansion port, a processor operably connected to said control device via said expansion port,
a control device operably connected to said expansion port and having a plurality of functions stored thereon,
a power supply,
a non-transitory computer-readable medium coupled to said processor,
wherein said non-transitory computer-readable medium contains instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising:
receiving said software application, executing said software application such that it is usable by a user, receiving said plurality of functions from said control device as said control device is manipulated by said user,
transforming said plurality of functions of said control device into a digital input,
wherein said digital input directs said processor how to manipulate said software application, manipulating said software application using said digital input,
streaming said software application to said computing entity having said virtual reality setting,
wherein said software application is imported into said virtual reality setting by said computing entity.
2) The system of claim 1, wherein said computing entity contains said software application and said virtual reality setting,
wherein said computing entity executes said software application and renders said virtual reality setting,
wherein said computing entity imports said software application into said virtual reality setting,
wherein said computing entity streams said virtual reality setting and said software application that have been integrated into one another to said processor,
wherein said computing entity alters said software application based on said digital input, wherein said non-transitory computer-readable medium contains additional instructions, which, when executed by said processor, cause said processor to perform additional operations comprising streaming said virtual reality setting and said software application from said computing entity and transmitting said digital input to said computing entity.
3) The system of claim 2, wherein said computing entity alters said virtual reality setting based on a virtual input,
wherein said non-transitory computer-readable medium contains additional instructions, which, when executed by said processor, cause said processor to perform additional operations comprising:
transforming said plurality of functions of said control device into said virtual input, wherein said virtual input manipulates a virtual embodiment of a physical machine within said virtual reality setting, and
transmitting said virtual input to said computing entity.
4) The system of claim 1, wherein said computing entity comprises:
a display,
a secondary processor operably connected to said display,
a secondary power supply,
a secondary non-transitory computer-readable medium coupled to said secondary processor,
wherein said secondary non-transitory computer-readable medium contains instructions stored thereon, which, when executed by said secondary processor, cause said secondary processor to perform operations comprising receiving said virtual reality setting,
rendering said virtual reality setting such that it is virtually navigable by said user,
receiving a stream of said software application from said processor, importing said stream of said software application into said virtual reality setting while said virtual reality setting is being rendered,
wherein said software application is presented via a virtual screen, presenting said virtual reality setting and said software application to said user simultaneously via said display.
5) The system of claim 4, wherein said non-transitory computer-readable medium further comprises additional instructions, which, when executed by said processor, cause said processor to perform additional operations comprising
transforming said plurality of functions of said control device into a virtual input,
wherein said virtual input manipulates a virtual embodiment of a physical machine within said virtual reality setting, and
transmitting said virtual input to said secondary processor.
6) The system of claim 5, wherein said secondary non-transitory computer-readable medium further comprises additional instructions, which, when executed by said secondary processor, cause said secondary processor to perform additional operations comprising
receiving said virtual input from said processor,
altering said virtual embodiment based on said virtual input,
wherein altering said virtual embodiment alters said virtual reality setting.
7) The system of claim 6, wherein said virtual reality setting comprises a virtual reality environment and a plurality of virtual objects,
wherein said plurality of virtual objects comprise a virtual reality display having said virtual screen,
wherein said user alters a virtual position within said virtual reality environment to move about said virtual reality environment.
8) The system of claim 7, wherein said display comprises a virtual reality headset, wherein said virtual reality headset presents said virtual position within said virtual reality setting, wherein changing a spatial position of said virtual reality headset causes said processor to alter said virtual position presented by said virtual reality headset.
9) The system of claim 8, wherein said virtual reality display of said virtual reality setting is spatially manipulatable within said virtual reality setting such that said virtual position of said virtual reality display and said virtual screen are simultaneously and contemporaneously altered.
10) The system of claim 9, wherein said software application is rendered within said virtual screen in a way such that changing said virtual position of said virtual reality display simultaneously and contemporaneously alters the rendering of said software application within said virtual screen as viewed by said user from said virtual position.
11) A system for importing a software application into a virtual reality setting, the system comprising: a display,
a processor operably connected to said display,
a power supply,
a control device operably connected to said processor and having a plurality of functions stored thereon,
wherein said plurality of functions of said control device are activated by a user via manipulation of said control device,
wherein said control device transmits said plurality of functions activated by said user to said processor,
a non-transitory computer-readable medium coupled to said processor,
wherein said non-transitory computer-readable medium contains said software application and instructions stored thereon, which, when executed by said processor, cause said processor to perform operations comprising
receiving said virtual reality setting,
rendering said virtual reality setting such that it is virtually navigable by said user,
receiving said software application,
executing said software application such that it is usable by said user, receiving said plurality of functions from said control device as said control device is manipulated by said user,
transforming said plurality of functions of said control device into a digital input,
wherein said digital input manipulates said software application,
importing said software application into said virtual reality setting while said software application is being executed and said virtual reality setting is being rendered,
wherein said software application is presented via a virtual screen, presenting said virtual reality setting and said software application via said display, and
manipulating said software application using said digital input.
12) The system of claim 11, wherein said non-transitory computer-readable medium further comprises additional instructions, which, when executed by said processor, cause said processor to perform additional operations comprising
transforming said plurality of functions of said control device into a virtual input,
wherein said virtual input manipulates a virtual embodiment of a physical machine within said virtual reality setting, and
altering said virtual embodiment based on said virtual input,
wherein altering said virtual embodiment alters said virtual reality setting.
13) The system of claim 12, wherein said control device is specifically designed to operate said software application, wherein said software application is specifically designed to operate said physical machine.
14) The system of claim 11, further comprising a computing entity operably connected to said processor,
wherein said computing entity contains said virtual reality setting rendered by said processor, wherein said computing entity transmits said virtual reality setting to said processor.
15) The system of claim 11, further comprising a computing entity operably connected to said processor,
wherein said computing entity contains said software application executed by said processor, wherein said computing entity transmits said software application to said processor.
16) The system of claim 11, further comprising a computing entity operably connected to said processor,
wherein said computing entity contains said software application and said virtual reality setting,
wherein said computing entity executes said software application and renders said virtual reality setting,
wherein said computing entity imports said software application into said virtual reality setting, wherein said computing entity streams said virtual reality setting and said software application to said processor,
wherein said computing entity alters said software application based on said digital input, wherein said non-transitory computer-readable medium contains additional instructions, which, when executed by said processor, cause said processor to perform additional operations comprising
streaming said virtual reality setting and said software application from said computing entity to said processor, and
transmitting said digital input to said computing entity.
17) The system of claim 16, wherein said computing entity alters said virtual reality setting based on said virtual input,
wherein said non-transitory computer-readable medium contains additional instructions, which, when executed by said processor, cause said processor to perform additional operations comprising transmitting said virtual input to said computing entity.
18) The system of claim 11, wherein said virtual reality setting comprises a virtual reality environment and a plurality of virtual objects,
wherein said plurality of virtual objects comprises a virtual reality display having said virtual screen,
wherein said user alters a virtual position within said virtual reality environment to move about said virtual reality environment.
19) The system of claim 18, wherein said display comprises a virtual reality headset, wherein said virtual reality headset presents said virtual position within said virtual reality setting, wherein changing a spatial position of said virtual reality headset causes said processor to alter said virtual position presented by said virtual reality headset.
20) The system of claim 19, wherein said virtual reality display of said virtual reality setting is spatially manipulatable within said virtual reality setting such that said virtual position of said virtual reality display and said virtual screen are simultaneously and contemporaneously altered.
21) The system of claim 20, wherein said software application is rendered within said virtual screen such that changing said virtual position of said virtual reality display simultaneously and contemporaneously alters the rendering of said software application within said virtual screen as viewed by said user from said virtual position.
22) A method for importing a software application into a virtual reality setting, said method comprising the steps of:
providing a computing entity,
wherein said computing entity is capable of rendering a virtual reality setting,
wherein said computing entity is capable of streaming video from another said computing entity,
providing said software application,
wherein said software application is stored on said computing entity, providing a control device having a plurality of functions,
wherein said control device is specifically made for said software application, wherein said control device transmits a plurality of functions to said computing entity as said control device is manipulated,
creating said virtual reality setting,
wherein said virtual reality setting comprises a virtual screen for presenting said software application,
rendering said virtual reality setting in a way such that it is virtually navigable,
wherein said virtual reality setting comprises a virtual reality environment having a plurality of virtual objects,
importing said software application into said virtual reality setting,
wherein said software application is presented within said virtual reality setting via said virtual screen,
receiving said plurality of functions from said control device as said control device is manipulated,
transforming said plurality of functions into a digital input, and
altering said software application based on said digital input.
23) The method of claim 22, further comprising the steps of:
providing a display,
wherein said display allows navigation of said virtual reality setting, and
presenting said virtual reality setting and said software application via a display.
24) The method of claim 22, further comprising the steps of
transforming said plurality of functions of said control device into a virtual input,
wherein said virtual input manipulates a virtual embodiment of a physical machine within said virtual reality setting, and
altering said virtual embodiment based on said virtual input,
wherein altering said virtual embodiment alters said virtual reality setting.
25) The method of claim 24, further comprising the steps of
providing a display,
wherein said display allows navigation of said virtual reality setting, and presenting said virtual reality setting and said software application via said display.
26) The method of claim 25, further comprising steps of
changing a spatial position of said display,
wherein said spatial position of said display coincides with a virtual position within said virtual reality setting, and
altering said virtual position within said virtual reality setting based on said spatial position of said display.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2018/052305 WO2020060569A1 (en) | 2018-09-21 | 2018-09-21 | System and method for importing a software application into a virtual reality setting |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2018/052305 WO2020060569A1 (en) | 2018-09-21 | 2018-09-21 | System and method for importing a software application into a virtual reality setting |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020060569A1 (en) | 2020-03-26 |
Family
ID=69887643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2018/052305 WO2020060569A1 (en) | 2018-09-21 | 2018-09-21 | System and method for importing a software application into a virtual reality setting |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020060569A1 (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170213473A1 (en) * | 2014-09-08 | 2017-07-27 | SimX, Inc. | Augmented and virtual reality simulator for professional and educational training |
US20170206708A1 (en) * | 2016-01-19 | 2017-07-20 | Immersv, Inc. | Generating a virtual reality environment for displaying content |
US20170256096A1 (en) * | 2016-03-07 | 2017-09-07 | Google Inc. | Intelligent object sizing and placement in a augmented / virtual reality environment |
US20170263033A1 (en) * | 2016-03-10 | 2017-09-14 | FlyInside, Inc. | Contextual Virtual Reality Interaction |
US20180033204A1 (en) * | 2016-07-26 | 2018-02-01 | Rouslan Lyubomirov DIMITROV | System and method for displaying computer-based content in a virtual or augmented environment |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114464043A (en) * | 2022-01-04 | 2022-05-10 | 大连斗牛科技有限公司 | Virtual simulation experience system for railcar design |
CN114504821A (en) * | 2022-01-17 | 2022-05-17 | 深圳市锐昌智能科技有限公司 | Method and device for controlling warning operation of virtual object in UE4 virtual reality |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190332400A1 (en) | System and method for cross-platform sharing of virtual assistants | |
US9396356B2 (en) | Endorsement of unmodified photographs using watermarks | |
AU2019201980B2 (en) | A collaborative virtual environment | |
US10585939B2 (en) | Real time object description service integrated with knowledge center on augmented reality (AR) and virtual reality (VR) devices | |
US20160162702A1 (en) | Managing access permissions to class notebooks and their section groups in a notebook application | |
US9472119B2 (en) | Computer-implemented operator training system and method of controlling the system | |
US10650118B2 (en) | Authentication-based presentation of virtual content | |
EP2851890A1 (en) | System and method for providing augmentation based learning content | |
CN102567459B (en) | Presentation process as context for presenter and audience | |
National Research Council et al. | Continuing innovation in information technology | |
CN109905429B (en) | System for team safety training | |
WO2020060569A1 (en) | System and method for importing a software application into a virtual reality setting | |
US20160034434A1 (en) | Contextual page generation with structured content independent of responsive display devices | |
An | Adopting metaverse‐related mixed reality technologies to tackle urban development challenges: An empirical study of an Australian municipal government | |
US20230161824A1 (en) | Management of data access using a virtual reality system | |
US20180059775A1 (en) | Role-based provision of virtual reality environment | |
EP4298568A1 (en) | Interactive avatar training system | |
CN112272328B (en) | Bullet screen recommendation method and related device | |
CN109918949A (en) | Recognition methods, device, electronic equipment and storage medium | |
US20130093774A1 (en) | Cloud-based animation tool | |
US20230244325A1 (en) | Learned computer control using pointing device and keyboard actions | |
CN204721386U (en) | The device of the micro-letter client of binding is realized based on Quick Response Code | |
CN111193791A (en) | Training system based on B/S architecture and information display method | |
CN210072615U (en) | Immersive training system and wearable equipment | |
US12028379B2 (en) | Virtual reality gamification-based security need simulation and configuration in any smart surrounding |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18934243; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18934243; Country of ref document: EP; Kind code of ref document: A1