US20180356878A1 - Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems


Info

Publication number: US20180356878A1
Authority: US (United States)
Prior art keywords: user, commands, actions, space, visual objects
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion)
Application number: US15/942,155
Inventors: Subhan Basha Dudekula, Arif Shuail Ahamed, Ramesh Babu Koniki, Manas Dutta, Mark Phillips
Current Assignee: Honeywell International Inc
Original Assignee: Honeywell International Inc
Application filed by: Honeywell International Inc
Assigned to: Honeywell International Inc. (Assignors: Dudekula, Subhan Basha; Ahamed, Arif Shuail; Koniki, Ramesh Babu; Dutta, Manas; Phillips, Mark)
Priority applications: US15/942,155 (US20180356878A1), CN201880033203.7A (CN110678827B), EP18814310.1A (EP3635520A4), PCT/US2018/034871 (WO2018226448A1)
Publication: US20180356878A1

Classifications

    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G06F16/116 Details of conversion of file system types or formats
    • G06F16/13 File access structures, e.g. distributed indices
    • G06F17/30091
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06Q10/1053 Employment or hiring
    • G06T19/006 Mixed reality
    • G09B5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • H04L67/131 Protocols for games, networked simulations or virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G09B7/02 Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • G09B9/00 Simulators for teaching or training purposes
    • H04L67/535 Tracking the activity of the user

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Human Computer Interaction (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Strategic Management (AREA)
  • Data Mining & Analysis (AREA)
  • Marketing (AREA)
  • Databases & Information Systems (AREA)
  • Tourism & Hospitality (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Economics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method includes receiving data defining user actions associated with an AR/VR space. The method also includes translating the user actions into associated commands and identifying associations of the commands with visual objects in the AR/VR space. The method further includes aggregating the commands, the associations of the commands with the visual objects, and an AR/VR environment setup into at least one file and transmitting or storing the file(s). Another method includes receiving at least one file containing commands, associations of the commands with visual objects in an AR/VR space, and an AR/VR environment setup. The other method also includes translating the commands into associated user actions and recreating or causing a user device to recreate (i) the AR/VR space containing the visual objects based on the AR/VR environment setup and (ii) the user actions in the AR/VR space based on the associations of the commands with the visual objects.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY CLAIM
  • This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/517,006, U.S. Provisional Patent Application No. 62/517,015, and U.S. Provisional Patent Application No. 62/517,037, all filed on Jun. 8, 2017. These provisional applications are hereby incorporated by reference in their entirety.
  • TECHNICAL FIELD
  • This disclosure generally relates to augmented reality and virtual reality systems. More specifically, this disclosure relates to an apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems.
  • BACKGROUND
  • Augmented reality and virtual reality technologies are advancing rapidly and becoming more and more common in various industries. Augmented reality generally refers to technology in which computer-generated content is superimposed over a real-world environment. Examples of augmented reality include games that superimpose objects or characters over real-world images and navigation tools that superimpose information over real-world images. Virtual reality generally refers to technology that creates an artificial simulation or recreation of an environment, which may or may not be a real-world environment. An example of virtual reality includes games that create fantasy or alien environments that can be explored by users.
  • SUMMARY
  • This disclosure provides an apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems.
  • In a first embodiment, a method includes receiving data defining user actions associated with an augmented reality/virtual reality (AR/VR) space. The method also includes translating the user actions into associated commands and identifying associations of the commands with visual objects in the AR/VR space. The method further includes aggregating the commands, the associations of the commands with the visual objects, and an AR/VR environment setup into at least one file. In addition, the method includes transmitting or storing the at least one file.
  • In a second embodiment, an apparatus includes at least one processing device configured to receive data defining user actions associated with an AR/VR space. The at least one processing device is also configured to translate the user actions into associated commands and to identify associations of the commands with visual objects in the AR/VR space. The at least one processing device is further configured to aggregate the commands, the associations of the commands with the visual objects, and an AR/VR environment setup into at least one file. In addition, the at least one processing device is configured to transmit or store the at least one file.
  • In a third embodiment, a method includes receiving at least one file containing commands, associations of the commands with visual objects in an AR/VR space, and an AR/VR environment setup. The method also includes translating the commands into associated user actions. In addition, the method includes recreating or causing a user device to recreate (i) the AR/VR space containing the visual objects based on the AR/VR environment setup and (ii) the user actions in the AR/VR space based on the associations of the commands with the visual objects.
  • In a fourth embodiment, an apparatus includes at least one processing device configured to receive at least one file containing commands, associations of the commands with visual objects in an AR/VR space, and an AR/VR environment setup. The at least one processing device is also configured to translate the commands into associated user actions. In addition, the at least one processing device is configured to recreate or cause a user device to recreate (i) the AR/VR space containing the visual objects based on the AR/VR environment setup and (ii) the user actions in the AR/VR space based on the associations of the commands with the visual objects.
  • In a fifth embodiment, a non-transitory computer readable medium contains instructions that when executed cause at least one processing device to perform the method of the first embodiment or any of its dependent claims. In a sixth embodiment, a non-transitory computer readable medium contains instructions that when executed cause at least one processing device to perform the method of the third embodiment or any of its dependent claims.
  • Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a more complete understanding of this disclosure, reference is now made to the following description, taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 illustrates an example architecture for recording interactive content in augmented/virtual reality according to this disclosure;
  • FIG. 2 illustrates an example architecture for replaying interactive content in augmented/virtual reality according to this disclosure;
  • FIG. 3 illustrates an example device that supports recording or replaying of interactive content in augmented/virtual reality according to this disclosure; and
  • FIGS. 4 and 5 illustrate example methods for recording and replaying interactive content in augmented/virtual reality according to this disclosure.
  • DETAILED DESCRIPTION
  • FIGS. 1 through 5, discussed below, and the various embodiments used to describe the principles of the present invention in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the invention. Those skilled in the art will understand that the principles of the invention may be implemented in any type of suitably arranged device or system.
  • As noted above, augmented reality and virtual reality technologies are advancing rapidly, and various potential uses for these technologies have been devised. However, some of those potential uses have a number of technical limitations or problems. For example, augmented reality and virtual reality technologies could be used to help train humans to perform various tasks, such as controlling an industrial process or repairing equipment. Unfortunately, the number of vendors providing augmented/virtual reality technologies is increasing quickly, and there are no industry standards for input mechanisms like gestures, voice commands, voice annotations, or textual messages. This poses a technical problem in attempting to create, record, and replay user actions as part of training content or other interactive content. Ideally, the interactive content should be device-agnostic so that it can be recorded from and replayed on various augmented/virtual reality devices. Alternatives such as video/image recording of a user's actions for distribution in an augmented/virtual environment pose various challenges, such as in terms of storage space, processor computation, and transmission bandwidth. These challenges can be particularly difficult when there is a large amount of interactive content to be recorded and replayed.
  • This disclosure provides techniques for recording, storing, and distributing users' actions in augmented/virtual reality environments. Among other things, this disclosure describes a portable file format that captures content such as user inputs, data formats, and training setups. The portable file format allows for easier storage, computation, and distribution of interactive content and addresses technical constraints with respect to space, computation, and bandwidth.
  • FIG. 1 illustrates an example architecture 100 for recording interactive content in augmented/virtual reality according to this disclosure. As shown in FIG. 1, the architecture 100 includes a training environment 102, which denotes a visualization layer that allows interaction with an augmented reality/virtual reality (AR/VR) space. In this example, the training environment 102 can include one or more end user devices, such as at least one AR/VR headset 104, at least one computing device 106, or at least one interactive AR/VR system 108. Each headset 104 generally denotes a device that is worn by a user and that displays an AR/VR space. The headset 104 in FIG. 1 is a MICROSOFT HOLOLENS device, although any other suitable AR/VR device could be used. Each computing device 106 generally denotes a device that processes data to present an AR/VR space (although not necessarily in a 3D format) to a user. Each computing device 106 denotes any suitable computing device, such as a desktop computer, laptop computer, tablet computer, or smartphone. Each interactive AR/VR system 108 includes a headset and one or more user input devices, such as interactive or smart gloves. Although not shown, one or more input devices could also be used with the headset 104 or the computing device 106.
  • The architecture 100 also includes at least one processor, such as in a server 110, that is used to record training content or other interactive content. The server 110 generally denotes a computing device that receives content from the training environment 102 and records and processes the content. The server 110 includes various functions or modules to support the recording and processing of interactive content. Each of these functions or modules could be implemented in any suitable manner, such as with software/firmware instructions executed by one or more processors. The server 110 could be positioned locally with or remote from the training environment 102.
  • Functionally, the server 110 includes a user input receiver 112, which receives, processes, and filters inputs made by the user. The user inputs could include any suitable inputs, such as gestures made by the user, voice commands or voice annotations spoken by the user, textual messages provided by the user, or pointing actions taken by the user using a pointing device (such as a smart glove). Any other or additional user inputs could also be received. The user inputs can be filtered in any suitable manner and are output to an input translator 114. To support the use of the architecture 100 by a wide range of users, input variants (like voice/text in different languages) could be supported. The user input receiver 112 includes any suitable logic for receiving and processing user inputs.
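  • The disclosure does not define a concrete data structure for these inputs. As a rough, hedged illustration only, the Python sketch below models a filtered stream of input events; the names InputEvent and UserInputReceiver and every field are assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InputEvent:
    """One raw user input captured in the AR/VR session (hypothetical schema)."""
    kind: str                            # "gesture", "voice", "text", or "pointer"
    payload: str                         # recognized gesture name, transcribed phrase, or message text
    target_object: Optional[str] = None  # id of the visual object the input was aimed at, if any
    timestamp_ms: int = 0
    language: str = "en"                 # supports input variants such as voice/text in other languages

class UserInputReceiver:
    """Receives, tracks, and filters raw input events (element 112)."""
    SUPPORTED_KINDS = {"gesture", "voice", "text", "pointer"}

    def filter(self, events: List[InputEvent]) -> List[InputEvent]:
        # Keep only event kinds the recorder understands and drop empty payloads.
        return [e for e in events if e.kind in self.SUPPORTED_KINDS and e.payload.strip()]
```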
  • The input translator 114 translates the various user inputs into specific commands by referring to a standard action grammar reference 116. The grammar reference 116 represents an actions-to-commands mapping dictionary that associates different user input actions with different commands. For example, the grammar reference 116 could associate certain spoken words, text messages, or physical actions with specific commands. The grammar reference 116 could support one or multiple possibilities for commands where applicable, such as when different commands may be associated with the same spoken words or text messages but different physical actions. The grammar reference 116 includes any suitable mapping or other association of actions and commands. The input translator 114 includes any suitable logic for identifying commands associated with received user inputs.
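  • One simple way to realize an actions-to-commands mapping dictionary is a keyed lookup table, as sketched below. This is illustrative only: the command names and the (modality, phrase) keys are invented, and the grammar reference 116 is not limited to this form. The sketch reuses the hypothetical InputEvent from the previous snippet.

```python
# Hypothetical actions-to-commands mapping dictionary standing in for the grammar reference 116.
ACTION_GRAMMAR = {
    ("gesture", "tap"):        "SELECT_OBJECT",
    ("gesture", "pinch"):      "ROTATE_OBJECT",
    ("voice",   "open valve"): "SET_VALVE_OPEN",
    ("text",    "open valve"): "SET_VALVE_OPEN",  # same words, different modality
    ("pointer", "point"):      "HIGHLIGHT_OBJECT",
}

def translate_action(event: InputEvent) -> str:
    """Input translator 114 (sketch): map a filtered user action to a system-understandable command."""
    try:
        return ACTION_GRAMMAR[(event.kind, event.payload.lower())]
    except KeyError:
        raise ValueError(f"no command mapped for {event.kind}/{event.payload!r}")
```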
  • The input translator 114 outputs identified commands to an aggregator 118. The aggregator 118 associates the commands with visual objects in the AR/VR space being presented to the user and aggregates the commands and those associations into one or more training modules 120. The aggregator 118 also embeds an AR/VR environment setup into the one or more training modules 120. The AR/VR environment setup can define what visual objects are to be presented in the AR/VR space. The training modules 120 therefore associate specific commands (which were generated based on user inputs) with specific visual objects in the AR/VR space as defined by the environment setup. The aggregator 118 includes any suitable logic for aggregating data.
  • The training modules 120 are created in a portable file format, which allows the training modules 120 to be used by various other user devices. For example, the data in the training modules 120 can be used by other user devices to recreate the AR/VR space and the actions taken in the AR/VR space. Effectively, this allows training content or other interactive content in the modules 120 to be provided to various users for training purposes or other purposes. The portable file format could be defined in any suitable manner, such as by using XML or JSON.
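  • As one hedged example of what such a portable file could contain if JSON were chosen, the sketch below bundles an environment setup, its visual objects, and time-ordered commands with their object associations. Every field name and value here is an assumption; the disclosure does not publish a schema.

```python
import json

# Hypothetical portable training module (element 120); the schema is invented for illustration.
training_module = {
    "format_version": "1.0",
    "environment_setup": {
        "scene": "compressor_station_bay_3",
        "visual_objects": [
            {"id": "valve_7", "model": "valve.glb", "position": [1.2, 0.9, -0.4]},
            {"id": "panel_2", "model": "panel.glb", "position": [0.0, 1.5, -1.0]},
        ],
    },
    "commands": [
        {"t_ms": 1200, "command": "SELECT_OBJECT",  "object_id": "valve_7"},
        {"t_ms": 3400, "command": "SET_VALVE_OPEN", "object_id": "valve_7",
         "annotation": "Always relieve pressure first."},  # a voice annotation kept as text
    ],
}

with open("training_module_120.json", "w") as f:
    json.dump(training_module, f, indent=2)
```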
  • The training modules 120 could be used in various ways. In this example, the training modules 120 are transferred over a network (such as via a local intranet or a public network like the Internet) for storage in a database 122. The database 122 could be local to the training environment 102 and the server 110 or remote from one or both of these components. As a particular example, the database 122 could be used within a cloud computing environment 124 or other remote network. Also, the database 122 could be accessed via a training service application programming interface (API) 126. The API 126 denotes a web interface that allows uploading or downloading of training modules 120. Note, however, that the training modules 120 could be stored or used in any other suitable manner.
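  • A web interface of this kind could be exercised over plain HTTP. The snippet below is only a sketch with an invented base URL and response shape; it is not the actual training service API.

```python
import requests

BASE_URL = "https://training-service.example.com/api/v1"  # invented endpoint, not a real service

# Upload a recorded training module to the database 122 via the training service API 126.
with open("training_module_120.json", "rb") as f:
    resp = requests.post(f"{BASE_URL}/modules", files={"module": f})
resp.raise_for_status()
module_id = resp.json()["id"]  # assumes the service echoes back an identifier

# Later, fetch the same module for replay.
module = requests.get(f"{BASE_URL}/modules/{module_id}").json()
```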
  • Based on this, the following process could be performed using the various components in FIG. 1. An application executing in an AR/VR space, on a mobile device, or on any other suitable device initiates a recording and sends user input action details (such as gestures, voice, and textual messages) to the user input receiver 112. The user input receiver 112 detects and tracks the user input actions (such as gestures, voice, textual messages, and pointing device actions), filters the actions as needed, and passes the selected/filtered actions to the input translator 114. The input translator 114 converts the user actions into system-understandable commands by referring to the grammar reference 116, and the input translator 114 passes these commands to the aggregator 118. The aggregator 118 associates the system-understandable commands to visual objects, embeds the AR/VR environment setup, and prepares one or more training modules 120 in a portable file format. The training modules 120 are processed and stored on-site or remotely.
  • In this way, the architecture 100 can be used to record and store one or more users' actions in one or more AR/VR environments. As a result, training data and other data associated with the AR/VR environments can be easily captured, stored, and distributed in the training modules 120. Other devices and systems can use the training modules 120 to recreate the AR/VR environments (either automatically or in a user-driven manner) and allow other people to view the users' actions in the AR/VR environments.
  • The training modules 120 can occupy significantly less space in memory and require significantly less bandwidth for transmission and storage compared to alternatives such as video/image recording. Moreover, the training modules 120 can be used to recreate the AR/VR environments and the users' actions in the AR/VR environments with significantly less computational requirements compared to alternatives such as video/image reconstruction and playback. These features can provide significant technical advantages, such as in systems that use large amounts of interactive data in a number of AR/VR environments.
  • Although FIG. 1 illustrates one example of an architecture 100 for recording interactive content in augmented/virtual reality, various changes may be made to FIG. 1. For example, the architecture 100 could support any number of training environments 102, headsets 104, computing devices 106, AR/VR systems 108, servers 110, or other components. Also, the training modules 120 could be used in any other suitable manner. In addition, while described as being used with or including a training environment 102 and generating training modules 120, the architecture 100 could be used with or include any suitable environment 102 and be used to generate any suitable modules 120 containing interactive content (whether or not used for training purposes).
  • FIG. 2 illustrates an example architecture 200 for replaying interactive content in augmented/virtual reality according to this disclosure. The architecture 200 in FIG. 2 is similar to the architecture 100 in FIG. 1, but the architecture 200 in FIG. 2 is used to replay interactive content that could have been captured using the architecture 100 in FIG. 1.
  • As shown in FIG. 2, the architecture 200 includes a training environment 202, which may or may not be similar to the training environment 102 described above. In this example, the training environment 202 includes at least one headset 204, at least one computing device 206, or at least one interactive AR/VR system 208. Note that these devices 204-208 may or may not be the same as the devices 104-108 in FIG. 1. Since the training modules 120 are in a portable file format, the training modules 120 can be generated and used by different types of devices.
  • The architecture 200 also includes at least one processor, such as in a server 210, that is used to replay training content or other interactive content. For example, the server 210 could receive one or more training modules 120 (such as from the database 122 via the API 126) and replay the interactive content from the modules 120 for one or more users. The server 210 includes various functions or modules to support the replay of interactive content. Each of these functions or modules could be implemented in any suitable manner, such as with software/firmware instructions executed by one or more processors. The server 210 could be positioned locally with or remote from the training environment 202. The server 210 could also denote the server 110 in FIG. 1, allowing the server 110/210 to both record and replay content.
  • Functionally, the server 210 includes a disassembler 218, which separates each training module 120 into separate data elements. The separate data elements could relate to various aspects of an AR/VR space, such as data related to the visual environment overall, data related to specific visual objects, and commands. The disassembler 218 can output the data related to the visual environment and the visual objects to the training environment 202. The training environment 202 can use this information to cause the appropriate user device 204-208 to recreate the overall visual environment and the visual objects in the visual environment within an AR/VR space being presented by the user device. The disassembler 218 can also output commands to a command translator 214. The disassembler 218 includes any suitable logic for separating data in training modules.
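  • In terms of the hypothetical JSON module sketched earlier, the disassembler's job reduces to splitting the file into its three element groups, for example:

```python
def disassemble(module: dict) -> tuple:
    """Disassembler 218 (sketch): split a module into environment setup, visual objects, and commands."""
    setup = module["environment_setup"]
    visual_objects = setup["visual_objects"]
    commands = module["commands"]
    return setup, visual_objects, commands
```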
  • The command translator 214 translates the various commands into specific user actions by referring to the standard action grammar reference 116. This allows the command translator 214 to map the commands back into user actions, effectively reversing the mapping done by the input translator 114. The command translator 214 includes any suitable logic for identifying user actions associated with received commands.
  • The command translator 214 outputs the user actions to an action performer 212. The action performer 212 interacts with the training environment 202 to cause the appropriate user device 204-208 to render the identified user actions and replay the user actions within the AR/VR space being presented by the user device. At least some of the user actions in the AR/VR space can be recreated based on the associations of the commands with specific visual objects in the AR/VR space. This allows the AR/VR environment to be recreated for the user based on the interactive content in a training module 120. The user could, for example, see how someone else controls an industrial process or repairs equipment. To support the use of the architecture 200 by a wide range of users, output variants (like voice/text in different languages) could be supported. The action performer 212 includes any suitable logic for creating actions within an AR/VR environment.
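  • Replay then amounts to inverting the grammar lookup and dispatching each recovered action against its associated visual object. The sketch below reuses the hypothetical names from the earlier snippets; the perform callback stands in for whatever rendering interface the training environment 202 actually exposes.

```python
# Reverse mapping: command -> a representative user action (command translator 214).
COMMAND_TO_ACTION = {command: action for action, command in ACTION_GRAMMAR.items()}

def replay(commands: list, perform) -> None:
    """Action performer 212 (sketch): render each user action on its associated visual object.

    `perform` is a callback supplied by the training environment 202 that animates one
    action (kind, payload) on one visual object in the AR/VR space.
    """
    for entry in sorted(commands, key=lambda c: c["t_ms"]):
        kind, payload = COMMAND_TO_ACTION[entry["command"]]
        perform(kind, payload, entry["object_id"])
```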
  • Based on this, the following process could be performed using the various components in FIG. 2. An application executing in an AR/VR space, on a mobile device, or on any other suitable device initiates the replay of a training module 120, such as via the API 126. The API 126 fetches the appropriate training module 120 and passes it to the disassembler 218. The disassembler 218 separates the training module 120 into data related to the visual environment, visual objects, and commands. The disassembler 218 passes the visual environment and visual object details to the training environment 202 and passes the commands to the command translator 214. The command translator 214 converts the commands to user actions by referring to the grammar reference 116 and passes the user actions to the action performer 212. The action performer 212 renders the user actions and replays them in the visual environment.
  • In this way, the architecture 200 can be used to recreate one or more people's actions in one or more AR/VR environments. As a result, training data and other data associated with the AR/VR environments can be easily obtained and used to recreate the AR/VR environments, allowing users to view other people's actions in the AR/VR environments. The training modules 120 can occupy significantly less space in memory and require significantly less bandwidth for reception and storage compared to alternatives such as video/image recording. Moreover, the training modules 120 can be used to recreate the AR/VR environments and people's actions in the AR/VR environments with significantly less computational requirements compared to alternatives such as video/image reconstruction and playback. These features can provide significant technical advantages, such as in systems that use large amounts of interactive data in a number of AR/VR environments.
  • Although FIG. 2 illustrates one example of an architecture 200 for replaying interactive content in augmented/virtual reality, various changes may be made to FIG. 2. For example, the architecture 200 could support any number of training environments 202, headsets 204, computing devices 206, AR/VR systems 208, servers 210, or other components. Also, the training modules 120 could be used in any other suitable manner. In addition, while described as being used with or including a training environment 202 and using training modules 120, the architecture 200 could be used with or include any suitable environment 202 and be used with any suitable modules 120 containing interactive content (whether or not used for training purposes).
  • Note that while the architectures 100 and 200 in FIGS. 1 and 2 are shown separately with different user devices 104-108/204-208, the architectures 100 and 200 could be implemented together. In such a case, a single server 110/210 could both capture content associated with a specific user and replay content from other users. This may allow, for example, a single user to both (i) send data identifying what that user is doing in his or her AR/VR environment and (ii) receive data identifying what one or more other users are doing in their AR/VR environments.
  • Also note that while the recording of training content and the later playback of that training content is one example use of the devices and techniques described above, other uses of the devices and techniques are also possible. For example, these devices and techniques could allow the server 110 to generate training content or other interactive content that is streamed or otherwise provided in real-time or near real-time to a server 210 for playback. This may allow, for instance, a first user to demonstrate actions in an AR/VR space that are then recreated in the AR/VR space for a second user. If desired, feedback can be provided from the second user to the first user, which may allow the first user to repeat or expand on certain actions. As another example, these devices and techniques could be used to record and recreate users' actions in any suitable AR/VR space, and the users' actions may or may not be used for training purposes.
  • This technology can find use in a number of ways in industrial automation settings or other settings. For example, control and safety systems and related instrumentation used in industrial plants (such as refinery, petrochemical, and pharmaceutical plants) are often very complex in nature. It may take a lengthy period of time (such as more than five years) to train new system maintenance personnel to become proficient in managing plant and system upsets independently. Combined with the growing number of experienced personnel retiring in the coming years, these long training times mean that industries face acute skill shortages and increased plant upsets due to a lack of experience and skill.
  • Traditional classroom training, whether face-to-face or online, often requires personnel to be away from the field for an extended time (such as 20 to 40 hours). In many cases, this is not practical, particularly for plants that are already facing resource and funding challenges due to overtime, travel, or other issues. Also, few sites have powered-on and functioning control hardware for training. Due to the fast rate of change for technology, it may no longer be cost-effective to procure and maintain live training systems.
  • Simulating control and safety system hardware in the AR/VR space, building dynamics of real hardware modules in virtual objects, and interfacing the AR/VR space with real supervisory systems (such as engineering and operator stations) can provide various benefits. For example, it can reduce or eliminate any dependency on real hardware for competency management. It can also “gamify” the learning of complex and mundane control and safety system concepts, which can help to keep trainees engaged. It can further decrease the time needed to become proficient in control and safety system maintenance through more hands-on practice sessions and higher retention of the training being imparted.
  • These represent example ways in which the devices and techniques described above could be used. However, these examples are non-limiting, and the devices and techniques described above could be used in any other suitable manner. In general, the devices and techniques described in this patent document could be applicable whenever one or more user actions in an AR/VR space are to be recorded, stored, and recreated in an AR/VR space for one or more other users (for whatever purpose).
  • FIG. 3 illustrates an example device 300 that supports recording or replaying of interactive content in augmented/virtual reality according to this disclosure. For example, the device 300 could represent either or both of the servers 110, 210 described above. Note that the same device 300 could be used to both record and replay training content or other interactive content, although this is not required.
  • As shown in FIG. 3, the device 300 includes at least one processing device 302, at least one storage device 304, at least one communications unit 306, and at least one input/output (I/O) unit 308. The processing device 302 executes instructions that may be loaded into a memory 310, such as instructions that (when executed by the processing device 302) implement the functions of the server 110 and/or the server 210. The processing device 302 includes any suitable number(s) and type(s) of processors or other devices in any suitable arrangement. Example types of processing devices 302 include microprocessors, microcontrollers, digital signal processors, field programmable gate arrays, application specific integrated circuits, and discrete circuitry.
  • The memory 310 and a persistent storage 312 are examples of storage devices 304, which represent any structure(s) capable of storing and facilitating retrieval of information (such as data, program code, and/or other suitable information on a temporary or permanent basis). The memory 310 may represent a random access memory or any other suitable volatile or non-volatile storage device(s). The persistent storage 312 may contain one or more components or devices supporting longer-term storage of data, such as a read only memory, hard drive, Flash memory, or optical disc.
  • The communications unit 306 supports communications with other systems or devices. For example, the communications unit 306 could include a network interface card or a wireless transceiver facilitating communications over a wired or wireless network (such as a local intranet or a public network like the Internet). The communications unit 306 may support communications through any suitable physical or wireless communication link(s).
  • The I/O unit 308 allows for input and output of data. For example, the I/O unit 308 may provide a connection for user input through a keyboard, mouse, keypad, touchscreen, or other suitable input device. The I/O unit 308 may also send output to a display, printer, or other suitable output device.
  • Although FIG. 3 illustrates one example of a device 300 that supports recording or replaying of interactive content in augmented/virtual reality, various changes may be made to FIG. 3. For example, computing devices come in a wide variety of configurations, and FIG. 3 does not limit this disclosure to any particular computing device.
  • FIGS. 4 and 5 illustrate example methods for recording and replaying interactive content in augmented/virtual reality according to this disclosure. In particular, FIG. 4 illustrates an example method 400 for recording interactive content in augmented/virtual reality, and FIG. 5 illustrates an example method 500 for replaying interactive content in augmented/virtual reality. For ease of explanation, the methods 400 and 500 are described as being performed using the device 300 operating as the server 110 in FIG. 1 (method 400) or as the server 210 in FIG. 2 (method 500). However, the methods 400 and 500 could be used with any suitable devices and in any suitable systems.
  • As shown in FIG. 4, a recording of user actions related to an AR/VR space is initiated at step 402. This could include, for example, the processing device 302 of the server 110 receiving an indication from a user device 104-108 that a user wishes to initiate the recording. Information defining an AR/VR environment setup is received at step 404. This could include, for example, the processing device 302 of the server 110 receiving information identifying the overall visual environment of the AR/VR space being presented to the user by the user device 104-108 and information identifying visual objects in the AR/VR space being presented to the user by the user device 104-108.
  • Information defining user actions associated with the AR/VR environment is received at step 406. This could include, for example, the processing device 302 of the server 110 receiving information identifying how the user is interacting with one or more of the visual objects presented in the AR/VR space by the user device 104-108. The interactions could take on various forms, such as the user making physical gestures, speaking voice commands, speaking voice annotations, or providing textual messages. This information is used to detect, track, and filter the user actions at step 408. This could include, for example, the processing device 302 of the server 110 processing the received information to identify distinct gestures, voice commands, voice annotations, or textual messages that occur. This could also include the processing device 302 of the server 110 processing the received information to identify visual objects presented in the AR/VR space that are associated with those user actions.
  • The user actions are translated into commands at step 410. This could include, for example, the processing device 302 of the server 110 using the standard action grammar reference 116 and its actions-to-commands mapping dictionary to associate different user actions with different commands. Specific commands are associated with specific visual objects presented in the AR/VR space at step 412. This could include, for example, the processing device 302 of the server 110 associating specific ones of the identified commands with specific ones of the visual objects presented in the AR/VR space. This allows the server 110 to identify which visual objects are associated with the identified commands.
  • At least one file is generated that contains the commands, the associations of the commands with the visual objects, and the AR/VR environment setup at step 414. This could include, for example, the processing device 302 of the server 110 generating a module 120 containing this information. The at least one file is output, stored, or used in some manner at step 416. This could include, for example, the processing device 302 of the server 110 storing the module 120 in a memory or database 122, streaming the module 120 to one or more destinations, or using the module 120 to recreate the user actions in another person's AR/VR space.
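  • Pulling the earlier sketches together, the recording flow of FIG. 4 could be approximated as follows. All names remain hypothetical, and the step numbers in the comments map the sketch back to the method 400.

```python
def record_session(raw_events, environment_setup) -> dict:
    """Rough composite of steps 402-416: filter, translate, associate, aggregate, and store."""
    receiver = UserInputReceiver()
    commands = []
    for event in receiver.filter(raw_events):          # steps 406-408
        commands.append({
            "t_ms": event.timestamp_ms,
            "command": translate_action(event),        # step 410
            "object_id": event.target_object,          # step 412
        })
    module = {
        "format_version": "1.0",
        "environment_setup": environment_setup,        # step 404
        "commands": commands,                          # step 414
    }
    with open("training_module_120.json", "w") as f:   # step 416
        json.dump(module, f, indent=2)
    return module
```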
  • As shown in FIG. 5, a replay of user actions related to an AR/VR space is initiated at step 502. This could include, for example, the processing device 302 of the server 210 receiving an indication from a user device 204-208 that a user wishes to initiate the replay. A suitable file containing commands, associations of the commands with visual objects, and an AR/VR environment setup is obtained at step 504. This could include, for example, the processing device 302 of the server 210 obtaining a module 120 from the database 122, such as via the API 126. The specific module 120 could be identified in any suitable manner, such as based on a specific module or topic identified by the server 210.
  • The contents of the file are separated at step 506. This could include, for example, the processing device 302 of the server 210 separating the data related to the AR/VR environment setup, the visual objects, and the commands. The commands are translated into user actions at step 508. This could include, for example, the processing device 302 of the server 210 using the standard action grammar reference 116 to associate different commands with different user actions. The specific commands (and therefore the specific user actions) are associated with specific visual objects to be presented in the AR/VR space based on the association data contained in the module 120.
  • The information related to the AR/VR environment setup and the visual objects is passed to a user device at step 510. This could include, for example, the processing device 302 of the server 210 passing the information to the user device 204-208. The user device recreates an AR/VR space based on the AR/VR environment setup and the visual objects at step 512, and the user device recreates the user actions in the AR/VR space at step 514. This could include, for example, the user device 204-208 creating an overall visual environment using the AR/VR environment setup and displaying visual objects within the visual environment. This could also include the action performer 212 causing the user device 204-208 to recreate specific user actions in association with specific visual objects within the AR/VR environment.
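  • Correspondingly, the replay flow of FIG. 5 could be wired up roughly as below, again reusing the hypothetical helpers from the earlier sketches; the user_device object and its methods are invented stand-ins for a real headset or computing device.

```python
def replay_session(module_id: str, user_device) -> None:
    """Rough composite of steps 502-514: fetch, separate, translate, and recreate."""
    module = requests.get(f"{BASE_URL}/modules/{module_id}").json()  # step 504
    setup, visual_objects, commands = disassemble(module)            # step 506
    user_device.build_space(setup, visual_objects)                   # steps 510-512 (invented method)
    replay(commands, user_device.perform_action)                     # steps 508 and 514
```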
  • Although FIGS. 4 and 5 illustrate examples of methods 400 and 500 for recording and replaying interactive content in augmented/virtual reality, various changes may be made to FIGS. 4 and 5. For example, while each figure illustrates a series of steps, various steps in each figure could overlap, occur in parallel, occur in a different order or occur any number of times.
  • In some embodiments, various functions described in this patent document are implemented or supported by a computer program that is formed from computer readable program code and that is embodied in a computer readable medium. The phrase “computer readable program code” includes any type of computer code, including source code, object code, and executable code. The phrase “computer readable medium” includes any type of medium capable of being accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), or any other type of memory. A “non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals. A non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable storage device.
  • It may be advantageous to set forth definitions of certain words and phrases used throughout this patent document. The terms “application” and “program” refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer code (including source code, object code, or executable code). The term “communicate,” as well as derivatives thereof, encompasses both direct and indirect communication. The terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation. The term “or” is inclusive, meaning and/or. The phrase “associated with,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. The phrases “at least one of” and “one or more of,” when used with a list of items, mean that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
  • The description in the present application should not be read as implying that any particular element, step, or function is an essential or critical element that must be included in the claim scope. The scope of patented subject matter is defined only by the allowed claims. Moreover, none of the claims invokes 35 U.S.C. § 112(f) with respect to any of the appended claims or claim elements unless the exact words “means for” or “step for” are explicitly used in the particular claim, followed by a participle phrase identifying a function. Use of terms such as (but not limited to) “mechanism,” “module,” “device,” “unit,” “component,” “element,” “member,” “apparatus,” “machine,” “system,” “processor,” or “controller” within a claim is understood and intended to refer to structures known to those skilled in the relevant art, as further modified or enhanced by the features of the claims themselves, and is not intended to invoke 35 U.S.C. § 112(f).
  • While this disclosure has described certain embodiments and generally associated methods, alterations and permutations of these embodiments and methods will be apparent to those skilled in the art. Accordingly, the above description of example embodiments does not define or constrain this disclosure. Other changes, substitutions, and alterations are also possible without departing from the spirit and scope of this disclosure, as defined by the following claims.

Claims (20)

What is claimed is:
1. A method comprising:
receiving data defining user actions associated with an augmented reality/virtual reality (AR/VR) space;
translating the user actions into associated commands;
identifying associations of the commands with visual objects in the AR/VR space;
aggregating the commands, the associations of the commands with the visual objects, and an AR/VR environment setup into at least one file; and
transmitting or storing the at least one file.
2. The method of claim 1, wherein the at least one file has a portable file format.
3. The method of claim 1, wherein the data defining the user actions comprises one or more of:
data defining one or more gestures made by a user;
data defining one or more voice commands or voice annotations spoken by the user;
data defining one or more textual messages provided by the user; and
data defining one or more pointing actions taken by the user using at least one pointing device.
4. The method of claim 1, wherein translating the user actions into the associated commands comprises using a grammar reference that associates different user input actions with different commands.
5. The method of claim 1, further comprising:
using the at least one file to recreate the user actions in the AR/VR space.
6. The method of claim 5, wherein recreation of the user actions is performed automatically or is driven by a user viewing the recreated user actions.
7. The method of claim 1, wherein:
the receiving, translating, identifying, aggregating, and transmitting occur using at least one of: a personal computer, a laptop computer, a server, a headset, a mobile device, and a computing cloud; and
transmitting or storing the at least one file comprises transmitting the at least one file over at least one of a local intranet or a public network.
8. An apparatus comprising:
at least one processing device configured to:
receive data defining user actions associated with an augmented reality/virtual reality (AR/VR) space;
translate the user actions into associated commands;
identify associations of the commands with visual objects in the AR/VR space;
aggregate the commands, the associations of the commands with the visual objects, and an AR/VR environment setup into at least one file; and
transmit or store the at least one file.
9. The apparatus of claim 8, wherein the data defining the user actions comprises one or more of:
data defining one or more gestures made by a user;
data defining one or more voice commands or voice annotations spoken by the user;
data defining one or more textual messages provided by the user; and
data defining one or more pointing actions taken by the user using at least one pointing device.
10. The apparatus of claim 8, wherein, to translate the user actions into the associated commands, the at least one processing device is configured to use a grammar reference that associates different user input actions with different commands.
11. A method comprising:
receiving at least one file containing commands, associations of the commands with visual objects in an augmented reality/virtual reality (AR/VR) space, and an AR/VR environment setup;
translating the commands into associated user actions; and
recreating or causing a user device to recreate (i) the AR/VR space containing the visual objects based on the AR/VR environment setup and (ii) the user actions in the AR/VR space based on the associations of the commands with the visual objects.
12. The method of claim 11, wherein the at least one file has a portable file format.
13. The method of claim 11, wherein the user actions comprise:
one or more gestures made by a user;
one or more voice commands or voice annotations spoken by the user;
one or more textual messages provided by the user; and
one or more pointing actions taken by the user using at least one pointing device.
14. The method of claim 11, wherein translating the commands into the associated user actions comprises using a grammar reference that associates different user input actions with different commands.
15. The method of claim 11, wherein recreation of the user actions is performed automatically or is driven by a user viewing the recreated user actions.
16. The method of claim 11, wherein:
the receiving, translating, and recreating or causing to recreate occur using at least one of: a personal computer, a laptop computer, a server, a headset, a mobile device, and a computing cloud; and
receiving the at least one file comprises receiving the at least one file over at least one of a local intranet or a public network.
17. An apparatus comprising:
at least one processing device configured to:
receive at least one file containing commands, associations of the commands with visual objects in an augmented reality/virtual reality (AR/VR) space, and an AR/VR environment setup;
translate the commands into associated user actions; and
recreate or cause a user device to recreate (i) the AR/VR space containing the visual objects based on the AR/VR environment setup and (ii) the user actions in the AR/VR space based on the associations of the commands with the visual objects.
18. The apparatus of claim 17, wherein the user actions comprise:
one or more gestures made by a user;
one or more voice commands or voice annotations spoken by the user;
one or more textual messages provided by the user; and
one or more pointing actions taken by the user using at least one pointing device.
19. The apparatus of claim 17, wherein, to translate the commands into the associated user actions, the at least one processing device is configured to use a grammar reference that associates different user input actions with different commands.
20. The apparatus of claim 17, wherein the at least one processing device is configured to:
separate the commands, the associations of the commands with the visual objects, and the AR/VR environment setup;
transmit the AR/VR environment setup to the user device to allow the user device to recreate the AR/VR space containing the visual objects; and
cause the user device to recreate the user actions based on the associations of the commands with the visual objects.
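A minimal sketch of the separate-and-dispatch flow recited in claim 20, continuing the hypothetical JSON recording sketched above; send_to_device is a placeholder for whatever transport actually links the processing device to the user device:

    import json

    def send_to_device(device_address: str, message: dict) -> None:
        """Placeholder transport; a real system might use HTTP, MQTT, or a proprietary link."""
        print(f"-> {device_address}: {json.dumps(message)[:80]}...")

    def replay_on_device(recording_path: str, device_address: str) -> None:
        with open(recording_path) as f:
            recording = json.load(f)

        # 1. Separate the commands, their object associations, and the environment setup.
        setup    = recording["environment_setup"]
        objects  = recording["visual_objects"]
        commands = recording["commands"]

        # 2. Transmit the environment setup so the user device can recreate the AR/VR space.
        send_to_device(device_address, {"type": "setup", "environment": setup, "objects": objects})

        # 3. Cause the user device to recreate each user action from its associated command.
        for cmd in sorted(commands, key=lambda c: c["t_ms"]):
            send_to_device(device_address, {"type": "action", **cmd})

    replay_on_device("session_recording.json", "headset-01.local")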
US15/942,155 2017-06-08 2018-03-30 Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems Abandoned US20180356878A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/942,155 US20180356878A1 (en) 2017-06-08 2018-03-30 Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems
CN201880033203.7A CN110678827B (en) 2017-06-08 2018-05-29 Apparatus and method for recording and playback of interactive content with augmented/virtual reality in industrial automation systems and other systems
EP18814310.1A EP3635520A4 (en) 2017-06-08 2018-05-29 Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems
PCT/US2018/034871 WO2018226448A1 (en) 2017-06-08 2018-05-29 Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201762517006P 2017-06-08 2017-06-08
US201762517015P 2017-06-08 2017-06-08
US201762517037P 2017-06-08 2017-06-08
US15/942,155 US20180356878A1 (en) 2017-06-08 2018-03-30 Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems

Publications (1)

Publication Number Publication Date
US20180356878A1 true US20180356878A1 (en) 2018-12-13

Family

ID=64563592

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/941,545 Abandoned US20180357922A1 (en) 2017-06-08 2018-03-30 Apparatus and method for assessing and tracking user competency in augmented/virtual reality-based training in industrial automation systems and other systems
US15/942,155 Abandoned US20180356878A1 (en) 2017-06-08 2018-03-30 Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems
US15/941,327 Active 2038-04-30 US10748443B2 (en) 2017-06-08 2018-03-30 Apparatus and method for visual-assisted training, collaboration, and monitoring in augmented/virtual reality in industrial automation systems and other systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/941,545 Abandoned US20180357922A1 (en) 2017-06-08 2018-03-30 Apparatus and method for assessing and tracking user competency in augmented/virtual reality-based training in industrial automation systems and other systems

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/941,327 Active 2038-04-30 US10748443B2 (en) 2017-06-08 2018-03-30 Apparatus and method for visual-assisted training, collaboration, and monitoring in augmented/virtual reality in industrial automation systems and other systems

Country Status (1)

Country Link
US (3) US20180357922A1 (en)

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11900022B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Apparatus for determining a position relative to a reference transceiver
US10949579B2 (en) 2017-02-22 2021-03-16 Middle Chart, LLC Method and apparatus for enhanced position and orientation determination
US11900021B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Provision of digital content via a wearable eye covering
US10902160B2 (en) 2017-02-22 2021-01-26 Middle Chart, LLC Cold storage environmental control and product tracking
US10740502B2 (en) 2017-02-22 2020-08-11 Middle Chart, LLC Method and apparatus for position based query with augmented reality headgear
US10831945B2 (en) 2017-02-22 2020-11-10 Middle Chart, LLC Apparatus for operation of connected infrastructure
US10620084B2 (en) 2017-02-22 2020-04-14 Middle Chart, LLC System for hierarchical actions based upon monitored building conditions
WO2020068177A1 (en) 2018-09-26 2020-04-02 Middle Chart, LLC Method and apparatus for augmented virtual models and orienteering
US11436389B2 (en) 2017-02-22 2022-09-06 Middle Chart, LLC Artificial intelligence based exchange of geospatial related digital content
US11194938B2 (en) 2020-01-28 2021-12-07 Middle Chart, LLC Methods and apparatus for persistent location based digital content
US11468209B2 (en) 2017-02-22 2022-10-11 Middle Chart, LLC Method and apparatus for display of digital content associated with a location in a wireless communications area
US11625510B2 (en) 2017-02-22 2023-04-11 Middle Chart, LLC Method and apparatus for presentation of digital content
US10984146B2 (en) 2017-02-22 2021-04-20 Middle Chart, LLC Tracking safety conditions of an area
US10762251B2 (en) * 2017-02-22 2020-09-01 Middle Chart, LLC System for conducting a service call with orienteering
US10824774B2 (en) 2019-01-17 2020-11-03 Middle Chart, LLC Methods and apparatus for healthcare facility optimization
US11475177B2 (en) 2017-02-22 2022-10-18 Middle Chart, LLC Method and apparatus for improved position and orientation based information display
US10740503B1 (en) 2019-01-17 2020-08-11 Middle Chart, LLC Spatial self-verifying array of nodes
US10628617B1 (en) 2017-02-22 2020-04-21 Middle Chart, LLC Method and apparatus for wireless determination of position and orientation of a smart device
US10872179B2 (en) 2017-02-22 2020-12-22 Middle Chart, LLC Method and apparatus for automated site augmentation
US11481527B2 (en) 2017-02-22 2022-10-25 Middle Chart, LLC Apparatus for displaying information about an item of equipment in a direction of interest
US10528700B2 (en) 2017-04-17 2020-01-07 Rockwell Automation Technologies, Inc. Industrial automation information contextualization method and system
US11169507B2 (en) 2017-06-08 2021-11-09 Rockwell Automation Technologies, Inc. Scalable industrial analytics platform
US10824867B1 (en) * 2017-08-02 2020-11-03 State Farm Mutual Automobile Insurance Company Augmented reality system for real-time damage assessment
US10691303B2 (en) * 2017-09-11 2020-06-23 Cubic Corporation Immersive virtual environment (IVE) tools and architecture
US11144042B2 (en) 2018-07-09 2021-10-12 Rockwell Automation Technologies, Inc. Industrial automation information contextualization method and system
US20200133451A1 (en) * 2018-10-25 2020-04-30 Autodesk, Inc. Techniques for analyzing the proficiency of users of software applications
CN109800377A (en) * 2019-01-02 2019-05-24 珠海格力电器股份有限公司 A kind of message based on AR delivers, inspection method and server and mobile terminal
US11625806B2 (en) * 2019-01-23 2023-04-11 Qualcomm Incorporated Methods and apparatus for standardized APIs for split rendering
US11403541B2 (en) 2019-02-14 2022-08-02 Rockwell Automation Technologies, Inc. AI extensions and intelligent model validation for an industrial digital twin
US11086298B2 (en) 2019-04-15 2021-08-10 Rockwell Automation Technologies, Inc. Smart gateway platform for industrial internet of things
US11048483B2 (en) 2019-09-24 2021-06-29 Rockwell Automation Technologies, Inc. Industrial programming development with an extensible integrated development environment (IDE) platform
US10942710B1 (en) 2019-09-24 2021-03-09 Rockwell Automation Technologies, Inc. Industrial automation domain-specific language programming paradigm
US11080176B2 (en) 2019-09-26 2021-08-03 Rockwell Automation Technologies, Inc. Testing framework for automation objects
US11392112B2 (en) 2019-09-26 2022-07-19 Rockwell Automation Technologies, Inc. Virtual design environment
US11163536B2 (en) 2019-09-26 2021-11-02 Rockwell Automation Technologies, Inc. Maintenance and commissioning
US11042362B2 (en) 2019-09-26 2021-06-22 Rockwell Automation Technologies, Inc. Industrial programming development with a trained analytic model
US11733687B2 (en) 2019-09-26 2023-08-22 Rockwell Automation Technologies, Inc. Collaboration tools
US11841699B2 (en) * 2019-09-30 2023-12-12 Rockwell Automation Technologies, Inc. Artificial intelligence channel for industrial automation
US11435726B2 (en) 2019-09-30 2022-09-06 Rockwell Automation Technologies, Inc. Contextualization of industrial data at the device level
US11249462B2 (en) 2020-01-06 2022-02-15 Rockwell Automation Technologies, Inc. Industrial data services platform
US11640486B2 (en) 2021-03-01 2023-05-02 Middle Chart, LLC Architectural drawing based exchange of geospatial related digital content
US11507714B2 (en) 2020-01-28 2022-11-22 Middle Chart, LLC Methods and apparatus for secure persistent location based digital content
US11308447B2 (en) 2020-04-02 2022-04-19 Rockwell Automation Technologies, Inc. Cloud-based collaborative industrial automation design environment
CN111538412B (en) * 2020-04-21 2023-12-15 北京恒华伟业科技股份有限公司 VR-based safety training method and device
US11887365B2 (en) * 2020-06-17 2024-01-30 Delta Electronics, Inc. Method for producing and replaying courses based on virtual reality and system thereof
US11726459B2 (en) 2020-06-18 2023-08-15 Rockwell Automation Technologies, Inc. Industrial automation control program generation from computer-aided design
US10930066B1 (en) 2020-09-11 2021-02-23 Mythical, Inc. Systems and methods for using natural language processing (NLP) to automatically generate three-dimensional objects in a virtual space
FR3114663A1 (en) * 2020-09-29 2022-04-01 Technip France System for virtual evaluation of an industrial process intended to be implemented in an industrial installation and associated method
US11077367B1 (en) * 2020-10-09 2021-08-03 Mythical, Inc. Systems and methods for using natural language processing (NLP) to control automated gameplay
RU205195U1 (en) * 2020-12-04 2021-07-01 Олеся Владимировна Чехомова VISUALIZER FOR VOICE CONTROL OF ANIMATED TATTOO
US11783001B2 (en) 2021-07-08 2023-10-10 Bank Of America Corporation System and method for splitting a video stream using breakpoints based on recognizing workflow patterns
US11951376B2 (en) 2022-06-10 2024-04-09 Crush Ar Llc Mixed reality simulation and training system

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5769640A (en) * 1992-12-02 1998-06-23 Cybernet Systems Corporation Method and system for simulating medical procedures including virtual reality and control method and system for use therein
US6345207B1 (en) * 1997-07-15 2002-02-05 Honda Giken Kogyo Kabushiki Kaisha Job aiding apparatus
US6480191B1 (en) * 1999-09-28 2002-11-12 Ricoh Co., Ltd. Method and apparatus for recording and playback of multidimensional walkthrough narratives
US20040051745A1 (en) * 2002-09-18 2004-03-18 Ullas Gargi System and method for reviewing a virtual 3-D environment
US20040100507A1 (en) * 2001-08-24 2004-05-27 Omri Hayner System and method for capturing browser sessions and user actions
US20040164897A1 (en) * 2003-02-24 2004-08-26 Simon Treadwell Apparatus and method for recording real time movements and experiences for subsequent replay in a virtual reality domain
US20040175684A1 (en) * 2001-07-11 2004-09-09 Johannes Kaasa System and methods for interactive training of procedures
US20050021281A1 (en) * 2001-12-05 2005-01-27 Wolfgang Friedrich System and method for establising a documentation of working processes for display in an augmented reality system in particular in a production assembly service or maintenance enviroment
US20050060719A1 (en) * 2003-09-12 2005-03-17 Useractive, Inc. Capturing and processing user events on a computer system for recording and playback
US20090210483A1 (en) * 2008-02-15 2009-08-20 Sony Ericsson Mobile Communications Ab Systems Methods and Computer Program Products for Remotely Controlling Actions of a Virtual World Identity
US20100205529A1 (en) * 2009-02-09 2010-08-12 Emma Noya Butin Device, system, and method for creating interactive guidance with execution of operations
US20120304059A1 (en) * 2011-05-24 2012-11-29 Microsoft Corporation Interactive Build Instructions
US20130203026A1 (en) * 2012-02-08 2013-08-08 Jpmorgan Chase Bank, Na System and Method for Virtual Training Environment
US20140282008A1 (en) * 2011-10-20 2014-09-18 Koninklijke Philips N.V. Holographic user interfaces for medical procedures
US20150260474A1 (en) * 2014-03-14 2015-09-17 Lineweight Llc Augmented Reality Simulator
US20150310751A1 (en) * 2014-04-24 2015-10-29 Indu Tolia Augmented reality assisted education content creation and management
US20160364321A1 (en) * 2014-02-20 2016-12-15 Hewlett Packard Enterprise Development Lp Emulating a user performing spatial gestures
US20170157512A1 (en) * 2015-12-06 2017-06-08 Sliver VR Technologies, Inc. Methods and systems for computer video game streaming, highlight, and replay
US20170213473A1 (en) * 2014-09-08 2017-07-27 SimX, Inc. Augmented and virtual reality simulator for professional and educational training
US9767613B1 (en) * 2015-01-23 2017-09-19 Leap Motion, Inc. Systems and method of interacting with a virtual object
US20180276895A1 (en) * 2017-03-27 2018-09-27 Global Tel*Link Corporation Personalized augmented reality in a controlled environment

Family Cites Families (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5751289A (en) 1992-10-01 1998-05-12 University Corporation For Atmospheric Research Virtual reality imaging system with image replay
JP2000122520A (en) 1998-10-14 2000-04-28 Mitsubishi Heavy Ind Ltd Virtual reality simulator and simulation method therefor
DE19900884A1 (en) 1999-01-12 2000-07-20 Siemens Ag System and method for operating and observing an automation system with process visualization and process control using virtual plant models as an image of a real plant
JP2002538543A (en) 1999-03-02 2002-11-12 シーメンス アクチエンゲゼルシヤフト System and method for contextually assisting dialogue with enhanced reality technology
US6356437B1 (en) 1999-03-29 2002-03-12 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing a portable customizable maintenance support instruction system
DE19953739C2 (en) 1999-11-09 2001-10-11 Siemens Ag Device and method for object-oriented marking and assignment of information to selected technological components
US6823280B2 (en) 2000-01-24 2004-11-23 Fluor Corporation Control system simulation, testing, and operator training
US7126558B1 (en) 2001-10-19 2006-10-24 Accenture Global Services Gmbh Industrial augmented reality
US7117135B2 (en) 2002-05-14 2006-10-03 Cae Inc. System for providing a high-fidelity visual display coordinated with a full-scope simulation of a complex system and method of using same for training and practice
US7203560B1 (en) 2002-06-04 2007-04-10 Rockwell Automation Technologies, Inc. System and methodology facilitating remote and automated maintenance procedures in an industrial controller environment
SE0203908D0 (en) 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
JP2005134536A (en) 2003-10-29 2005-05-26 Omron Corp Work training support system
JP4227046B2 (en) 2004-03-11 2009-02-18 三菱電機株式会社 Maintenance training system
KR100721713B1 (en) 2005-08-25 2007-05-25 명지대학교 산학협력단 Immersive training system for live-line workers
US9323055B2 (en) 2006-05-26 2016-04-26 Exelis, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
SE0601216L (en) 2006-05-31 2007-12-01 Abb Technology Ltd Virtual workplace
DE102006045503A1 (en) 2006-09-27 2008-04-03 Abb Technology Ag System and method for integrating process control systems into a training simulation
SG144001A1 (en) 2006-12-29 2008-07-29 Yokogawa Electric Corp An operator training apparatus for a manufacturing environment and a method of use thereof
US8370207B2 (en) 2006-12-30 2013-02-05 Red Dot Square Solutions Limited Virtual reality system including smart objects
US20080218331A1 (en) 2007-03-08 2008-09-11 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness
US8390534B2 (en) 2007-03-08 2013-03-05 Siemens Aktiengesellschaft Method and device for generating tracking configurations for augmented reality applications
EP2203878B1 (en) 2007-09-18 2017-09-13 VRMEDIA S.r.l. Information processing apparatus and method for remote technical assistance
WO2009036782A1 (en) 2007-09-18 2009-03-26 Vrmedia S.R.L. Information processing apparatus and method for remote technical assistance
WO2009103089A1 (en) 2008-02-15 2009-08-20 Invensys Systems, Inc. System and method for autogenerating simulations for process control system checkout and operator training
WO2009155483A1 (en) 2008-06-20 2009-12-23 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
US8055375B2 (en) 2008-09-30 2011-11-08 Rockwell Automation Technologies, Inc. Analytical generator of key performance indicators for pivoting on metrics for comprehensive visualizations
CA2795826C (en) 2010-04-08 2015-08-11 Vrsim, Inc. Simulator for skill-oriented training
US9529424B2 (en) 2010-11-05 2016-12-27 Microsoft Technology Licensing, Llc Augmented reality with direct user interaction
CN102136204A (en) 2011-02-25 2011-07-27 中国人民解放军第二炮兵工程学院 Virtual maintenance distribution interactive simulation support platform of large equipment and collaborative maintenance method
US20140004487A1 (en) 2011-03-25 2014-01-02 Joseph M. Cheben Immersive Training Environment
US9529348B2 (en) 2012-01-24 2016-12-27 Emerson Process Management Power & Water Solutions, Inc. Method and apparatus for deploying industrial plant simulators using cloud computing technologies
CN102881202B (en) 2012-10-09 2015-05-20 江苏省电力公司徐州供电公司 Virtual reality based on-load tap switch maintenance training method
US9400495B2 (en) 2012-10-16 2016-07-26 Rockwell Automation Technologies, Inc. Industrial automation equipment and machine procedure simulation
EP2936445A2 (en) 2012-12-20 2015-10-28 Accenture Global Services Limited Context based augmented reality
US20140320529A1 (en) * 2013-04-26 2014-10-30 Palo Alto Research Center Incorporated View steering in a combined virtual augmented reality system
US9438648B2 (en) 2013-05-09 2016-09-06 Rockwell Automation Technologies, Inc. Industrial data analytics in a cloud platform
TWI484452B (en) 2013-07-25 2015-05-11 Univ Nat Taiwan Normal Learning system of augmented reality and method thereof
US9472119B2 (en) 2013-08-26 2016-10-18 Yokogawa Electric Corporation Computer-implemented operator training system and method of controlling the system
ITBO20130466A1 (en) 2013-08-29 2015-03-01 Umpi Elettronica Società a Responsabilità Limitata METHOD OF INSPECTION AND/OR MAINTENANCE OF A PART OF AN INDUSTRIAL PLANT BY AUGMENTED REALITY, AND CORRESPONDING SYSTEM TO GUIDE THE INSPECTION AND/OR MAINTENANCE OF THE INDUSTRIAL PLANT
JP6159217B2 (en) 2013-10-11 2017-07-05 三菱重工業株式会社 Plant operation training apparatus, control method, program, and plant operation training system
US9489832B2 (en) 2014-04-04 2016-11-08 Rockwell Automation Technologies, Inc. Industrial-enabled mobile device
WO2015160515A1 (en) 2014-04-16 2015-10-22 Exxonmobil Upstream Research Company Methods and systems for providing procedures in real-time
US20150325047A1 (en) 2014-05-06 2015-11-12 Honeywell International Inc. Apparatus and method for providing augmented reality for maintenance applications
JP2016011989A (en) 2014-06-27 2016-01-21 三菱電機株式会社 Operation training device
US9576329B2 (en) 2014-07-31 2017-02-21 Ciena Corporation Systems and methods for equipment installation, configuration, maintenance, and personnel training
US20160071319A1 (en) 2014-09-09 2016-03-10 Schneider Electric It Corporation Method to use augumented reality to function as hmi display
JP6320270B2 (en) 2014-10-14 2018-05-09 三菱電機株式会社 Driving training simulator system
EP3281403A4 (en) * 2015-04-06 2018-03-07 Scope Technologies US Inc. Methods and apparatus for augmented reality applications
KR101806356B1 (en) 2015-04-27 2017-12-15 주식회사 셈웨어 Virtual plant simulator using block diagrams and plc simulator and simulator executing method using the same
US10181326B2 (en) * 2015-06-01 2019-01-15 AffectLayer, Inc. Analyzing conversations to automatically identify action items
KR20170005920A (en) 2015-07-06 2017-01-17 주식회사 톨레미시스템 The plant safety Training Method and System.
US20170148214A1 (en) 2015-07-17 2017-05-25 Ivd Mining Virtual reality training
US11222551B2 (en) 2015-07-23 2022-01-11 Rockwell Automation Technologies, Inc. Snapshot management architecture for process control operator training system lifecycle
US11064009B2 (en) 2015-08-19 2021-07-13 Honeywell International Inc. Augmented reality-based wiring, commissioning and monitoring of controllers
US20170090970A1 (en) 2015-09-30 2017-03-30 Yokogawa Electric Corporation Method, system and computer program for cloud based computing clusters for simulated operator training systems
EP3151217A1 (en) 2015-10-02 2017-04-05 Siemens Aktiengesellschaft Operator training system
CN105225551A (en) 2015-11-12 2016-01-06 武汉科迪奥电力科技有限公司 Based on the high-tension switch cabinet Training Methodology of virtual reality technology
US9573062B1 (en) 2015-12-06 2017-02-21 Silver VR Technologies, Inc. Methods and systems for virtual reality streaming and replay of computer video games
US9473758B1 (en) 2015-12-06 2016-10-18 Sliver VR Technologies, Inc. Methods and systems for game video recording and virtual reality replay
US10795449B2 (en) * 2015-12-11 2020-10-06 Google Llc Methods and apparatus using gestures to share private windows in shared virtual environments
US20170249785A1 (en) 2016-02-29 2017-08-31 Vreal Inc Virtual reality session capture and replay systems and methods
US10720184B2 (en) 2016-03-16 2020-07-21 Intel Corporation Selective recording of augmented reality objects
US9971960B2 (en) * 2016-05-26 2018-05-15 Xesto Inc. Method and system for providing gesture recognition services to user applications
US20180203238A1 (en) * 2017-01-18 2018-07-19 Marshall Leroy Smith, JR. Method of education and simulation learning
US20180324229A1 (en) * 2017-05-05 2018-11-08 Tsunami VR, Inc. Systems and methods for providing expert assistance from a remote expert to a user operating an augmented reality device
US11270601B2 (en) * 2017-06-29 2022-03-08 Verb Surgical Inc. Virtual reality system for simulating a robotic surgical environment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210264810A1 (en) * 2018-06-26 2021-08-26 Rebecca Johnson Method and system for generating a virtual reality training session
US10726631B1 (en) * 2019-08-03 2020-07-28 VIRNECT inc. Augmented reality system and method with frame region recording and reproduction technology based on object tracking
CN110782722A (en) * 2019-09-30 2020-02-11 南京浩伟智能科技有限公司 Teaching system and teaching method based on AR system
US11894130B2 (en) 2019-12-26 2024-02-06 Augmenticon Gmbh Pharmaceutical manufacturing process control, support and analysis
US20220245051A1 (en) * 2021-02-01 2022-08-04 Raytheon Company Annotated deterministic trace abstraction for advanced dynamic program analysis
US11580006B2 (en) * 2021-02-01 2023-02-14 Raytheon Company Annotated deterministic trace abstraction for advanced dynamic program analysis

Also Published As

Publication number Publication date
US20180357823A1 (en) 2018-12-13
US10748443B2 (en) 2020-08-18
US20180357922A1 (en) 2018-12-13

Similar Documents

Publication Publication Date Title
US20180356878A1 (en) Apparatus and method for recording and replaying interactive content in augmented/virtual reality in industrial automation systems and other systems
US9870346B2 (en) Clickable links within live collaborative web meetings
US20190304188A1 (en) Systems and methods for multi-user virtual reality remote training
EP3635521A1 (en) Apparatus and method for visual-assisted training, collaboration, and monitoring in augmented/virtual reality in industrial automation systems and other systems
CN102508863A (en) On-line courseware making system and making method
Kaarlela et al. Digital twins utilizing XR-technology as robotic training tools
Neumann et al. AVIKOM: towards a mobile audiovisual cognitive assistance system for modern manufacturing and logistics
CN110678827B (en) Apparatus and method for recording and playback of interactive content with augmented/virtual reality in industrial automation systems and other systems
WO2018226452A1 (en) Apparatus and method for assessing and tracking user competency in augmented/virtual reality-based training in industrial automation systems and other systems
Saliah-Hassane et al. Special session—online laboratories in engineering education: innovation, disruption, and future potential
US11721075B2 (en) Systems and methods for integrating and using augmented reality technologies
Horvath et al. Overview of modern teaching equipment that supports distant learning
US20220150290A1 (en) Adaptive collaborative real-time remote remediation
CN112462929A (en) Recording and playback apparatus and method for interactive contents in augmented virtual reality
Ehlenz et al. Open Research Tools for the Learning Technologies Innovation Lab.
Martin et al. ICT needs and trends in engineering education
EP3398344A1 (en) System and method for presenting video and associated documents and for tracking viewing thereof
NAIDU et al. HTML5 BASED E-LEARNING AUTHORING TO FACILITATE INTERACTIVE LEARNING DURING COVID-19 PANDEMIC: A REVIEW
Naidu et al. A REVIEW OF IMPLEMENTATION OF HTML5 BASED PLATFORMS TO FACILITATE INTERACTIVE ONLINE LEARNING DURING COVID-19 PANDEMIC
Reis et al. Case Study: The Impact of VR & AR Simulation Project to Decrease Human Exposure in Critical Environments
Rodriguez et al. Developing a mobile learning environment: An axiomatic approach
Charachristos et al. Enhancing collaborative learning and management tasks through pervasive computing technologies
Victor et al. Cracking the mobile learning code: xAPI and cmi5
Hasegawa et al. A Ubiquitous Lecture Archive Learning Platform with Note-Centered Approach
Brunelle Lab 2-Speed Notes Prototype Product Specification

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONEYWELL INTERNATIONAL INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DUDEKULA, SUBHAN BASHA;AHAMED, ARIF SHUAIL;KONIKI, RAMESH BABU;AND OTHERS;SIGNING DATES FROM 20180329 TO 20180330;REEL/FRAME:045401/0895

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION