WO2018222225A1 - Semantic information model and enhanced reality interface for workforce and asset management - Google Patents


Info

Publication number
WO2018222225A1
Authority
WO
WIPO (PCT)
Prior art keywords
asset
semantic
ar device
information
particular
Application number
PCT/US2017/067775
Other languages
French (fr)
Inventor
Mareike KRITZLER
Iori MIZUTANI
Elvia Kimberly GARCIA GARCIA
Original Assignee
Siemens Aktiengesellschaft
Priority to US 62/513,492 (provisional)
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Publication of WO2018222225A1 publication Critical patent/WO2018222225A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06QDATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/20Product repair or maintenance administration

Abstract

Systems, methods, and computer-readable media are disclosed for utilizing semantic technologies to provide an integrated information source that is accessible and capable of being manipulated via an augmented-reality (AR) device. The AR device provides a user interface via which a user provides gesture-based or voice-based input to access and manipulate data in disparate and independent data sources using a semantic information model. User manipulations to the user interface provided by the AR device are reflected in the underlying semantic information model and the underlying data values associated with entities represented in the semantic information model. The AR device also provides a user interface for performing virtual testing of replacement assets with respect to a physical machine in a real-world environment. In addition, the AR device enables the user to first manipulate a virtual machine and then have a real machine mimic the simulated manipulation/movements.

Description

SEMANTIC INFORMATION MODEL AND ENHANCED REALITY INTERFACE FOR WORKFORCE AND ASSET MANAGEMENT

CROSS-REFERENCE TO RELATED APPLICATION

[01] This application is a non-provisional of, and claims the benefit of, United States Provisional Application Serial Number 62/513,492, filed June 1, 2017, which is incorporated by reference herein in its entirety.

BACKGROUND

[02] Industrial managers, such as production or factory managers, are tasked with asset management and workforce management duties which typically require accessing disparate and independent information sources. Such systems can contain data and information about a workforce, assets, logistics of such assets, and an interface to a procurement system. While asset and workforce management systems exist for managing access to such varied information sources, such conventional systems suffer from a number of drawbacks, technical solutions to which are discussed herein.

BRIEF DESCRIPTION OF THE DRAWINGS

[03] The detailed description is set forth with reference to the accompanying drawings. The drawings are provided for purposes of illustration only and merely depict example embodiments of the disclosure. The drawings are provided to facilitate understanding of the disclosure and shall not be deemed to limit the breadth, scope, or applicability of the disclosure. In the drawings, the left-most digit(s) of a reference numeral identifies the drawing in which the reference numeral first appears. The use of the same reference numerals indicates similar, but not necessarily the same or identical components. However, different reference numerals may be used to identify similar components as well. Various embodiments may utilize elements or components other than those illustrated in the drawings, and some elements and/or components may not be present in various embodiments. The use of singular terminology to describe a component or element may, depending on the context, encompass a plural number of such components or elements and vice versa.

[04] FIG. 1 is a hybrid system component/data flow diagram illustrating an augmented-reality (AR)-based integrated information system in accordance with one or more example embodiments of the disclosure.

[05] FIG. 2 is a process flow diagram of an illustrative method for providing an AR-based user interface communicatively coupled to an integrated information system backend to enable asset and workforce management in accordance with one or more example embodiments of the disclosure.

[06] FIG. 3 is a process flow diagram of an illustrative method for providing an AR-based user interface communicatively coupled to an integrated information system backend to enable virtual testing of potential replacement assets in accordance with one or more example embodiments of the disclosure.

[07] FIG. 4 is a schematic diagram of an illustrative networked architecture in accordance with one or more example embodiments of the disclosure.

DETAILED DESCRIPTION

[08] This disclosure relates to, among other things, devices, servers, systems, methods, computer-readable media, techniques, and methodologies for utilizing semantic technologies to provide an integrated information source that is accessible and capable of being manipulated via an augmented-reality (AR) interface. While example embodiments of the disclosure will be described herein in connection with augmented-reality technologies, it should be appreciated that any of a variety of enhanced-reality technologies may be employed including, without limitation, virtual-reality technologies, mixed-reality technologies, or the like. In accordance with example embodiments of the disclosure, an AR device, such as a head-mounted display, may provide a user interface via which a user may provide gesture-based or voice-based input to access and manipulate data in disparate and independent data sources based on queries defined with respect to a semantic information model. User manipulations in the user interface provided by the AR device can automatically be reflected by changes to the data values defined by the underlying semantic information model. In addition, the AR device may provide a user interface for performing virtual testing of replacement assets with respect to a physical machine in a real-world environment.

[09] Industrial managers, such as production or factory managers, are in charge of workforce and asset management. With respect to workforce management, managers must ensure, for example, that geographic workforce distribution and workforce certifications are adequate. As such, managers need to know the locations of members of their workforce and the current context of service technicians in the field to ensure sufficient workforce coverage for the various facilities being managed.

[010] Managers are also responsible for the status of operational industrial machinery and assets in a factory. Machines can be equipped with a variety of tools for production processes. For example, robots or robotic arms can mount a vast variety of grippers depending on the tasks they need to perform. In order to manage assets, such as grippers, managers need to ensure that the assets do not exceed their lifecycle and that they will be maintained, serviced, or replaced on time. Consequently, managers need to know the status of deployed machines and are responsible for assigning tasks such as repair, maintenance, and/or replacement to their workforce according to the equipment's status and asset lifecycle information derived from data such as throughput, workload, and depreciation rate.

[011] In order to effectively perform workforce management, a manager needs access to information such as workforce locations, workload, training status, qualifications, level of expertise, access permissions, and so forth. In order to effectively perform asset management, a manager needs access to asset lifecycle and inventory information. In conventional systems, accessing such information requires using different software tools and solutions to access disparate and independent data repositories or other sources. As a result, such conventional solutions suffer from a number of drawbacks.

[012] First, in conventional solutions, in order to know the current context of the workforce, a manager needs to access and query different information sources and manually match attributes of interest such as the location of an asset and the location of a worker. Further, to manipulate the data, a manager must alter the data at its source, which may require updating data stored in multiple disparate information sources. In addition, in order to access such different information sources, conventional solutions require managers and technicians to use different kinds of software tools at data terminals that may not be located in close proximity to one another or that may not even be at the location at which a task needs to be performed.

[013] Other drawbacks associated with conventional solutions relate to asset management. For instance, with conventional solutions, if inventory needs to be accessed, such as for a replacement task, only parts that are physically available in the inventory at a given time and location can be examined. Thus, conventional solutions do not permit feasibility tests to be conducted in a manner that ensures that a part will actually meet the tangible requirements in a specific setup where real-world constraints are taken into consideration. While asset testing may be conducted in a purely virtual setting in conventional solutions, such purely virtual testing does not ensure that a part will satisfy real-world constraints.

[014] Example embodiments of the disclosure address these and other drawbacks associated with conventional solutions by utilizing semantic technologies to provide an integrated information source that is accessible and capable of being manipulated via an AR device. Thus, example embodiments of the disclosure provide an intuitive and easy-to-use interface via the AR device that enables access to and manipulation of distributed workforce and asset data from multiple independent information sources while also providing the capability to generate new information and knowledge through interaction with virtual and real-world objects. The AR device provides an interface that makes information accessible intuitively and allows for tangible information manipulation. In addition, the AR device provides an interface for the simulation of the operation of virtual assets in connection with a real-world machine. As such, example embodiments of the disclosure provide, among other things, the matching of workforce and work tasks occurring at different locations while taking into account differences in workforce training and qualifications as well as the correct matching of replacement assets with a real-world machine setup in order to ensure that the replacement assets fit the real-world environment.

[015] FIG. 1 is a hybrid system component/data flow diagram illustrating an AR-based integrated information system 100. FIG. 2 is a process flow diagram of an illustrative method 200 for providing an AR-based user interface communicatively coupled to an integrated information system backend to enable asset and workforce management. FIG. 3 is a process flow diagram of an illustrative method 300 for providing the AR-based user interface to enable virtual testing of potential replacement assets. FIGS. 2 and 3 will each be described in conjunction with FIG. 1 hereinafter.

[016] Each operation of the method 200 or the method 300 may be performed by one or more components that may be implemented in any combination of hardware, software, and/or firmware. In certain example embodiments, one or more of these component(s) may be implemented, at least in part, as software and/or firmware that contains or is a collection of one or more program modules that include computer-executable instructions that when executed by a processing circuit cause one or more operations to be performed. A system or device described herein as being configured to implement example embodiments of the invention may include one or more processing circuits, each of which may include one or more processing units or nodes. Computer-executable instructions may include computer-executable program code that when executed by a processing unit may cause input data contained in or referenced by the computer-executable program code to be accessed and processed to yield output data.

[017] FIG. 1 is a hybrid system component/data flow diagram illustrating an AR-based integrated information system 100. The system 100 may include one or more servers 102 (hereinafter referred to as the server 102 for ease of explanation). The server 102 may be located remotely from an AR device 104. The device 104 may be a wearable device having an integrated projection means and/or may be capable of interacting with an external projector to project virtual objects onto a real-world environment. For example, the AR device 104 may be a head-mounted display configured to project virtual objects onto a real-world environment in the user's field-of-view. The virtual objects can be manipulated with respect to the real-world environment.
The device 104 may be communicatively coupled to the server 102 via one or more networks 118 such that input received at the device 104 may be processed at the server 102 to generate output which may then be presented to a user (e.g., a manager or service technician) via the device 104.

[018] The server 102 may be configured to access multiple independent data sources such as, for example, multiple independent datastores 106. The multiple data sources may store a variety of types of data such as, for example, workforce entity data 108 and asset entity data 110. The workforce entity data 108 may include workforce entities (e.g., workerl, worker2, etc.) with corresponding workforce locations, workload information, training status information, information indicating qualification and/or expertise levels, information indicating access permissions, or the like. The asset entity data 110 may include asset entities (e.g., gripperl, gripper2, etc.) with corresponding asset attribute information (e.g., depreciation rate, purchase date, location of asset, lifecycle, or any other suitable attribute); inventory status information; replacement asset information; and so forth. In certain example embodiments, the workforce entity data 108 and/or the asset entity data 110 may include any type of data that follows the semantic representation defined by a semantic information model 114, which will be described in more detail hereinafter.

[019] Referring now to FIG. 2 in conjunction with FIG. 1, at block 202 of the method 200, a user request for asset information may be received at the AR device 104. A manager, for example, may initiate the request for asset information by providing gesture-based input to an interface of the AR device 104 or by providing voice-based commands. Then, at block 204 of the method 200, the AR device 104 may trigger a query to the server 102, or more specifically, a semantic processing module 112 residing and executing on the server 102. In particular, in response to receiving a request 120 for asset information from the AR device 104, the semantic processing module 112 may then query the entities stored in the datastore(s) 106 to retrieve the requested asset information.
For example, the semantic processing module 112 may contain a set of predefined queries 116 that the semantic processing module 112 can use to query the datastore(s) 106. An example query 116 can, for instance, select all grippers that are at the same location as worker A. While the queries 116 may be predefined, the results obtained from the datastore(s) 106 may change depending on the status of the data (e.g., if worker A moves from location X to location Y, the resulting grippers would likely be different).
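For illustration, such a predefined query can be sketched against a toy triple store; the entity and predicate names below (gripper1, workerA, locatedAt) are hypothetical and not taken from the actual semantic information model 114:

```python
# A minimal in-memory triple store standing in for the datastore(s);
# all names are illustrative.
triples = {
    ("gripper1", "type", "Gripper"),
    ("gripper2", "type", "Gripper"),
    ("gripper1", "locatedAt", "plantX"),
    ("gripper2", "locatedAt", "plantY"),
    ("workerA", "locatedAt", "plantX"),
}

def objects(subject, predicate):
    """Return all objects o such that (subject, predicate, o) is asserted."""
    return {o for s, p, o in triples if s == subject and p == predicate}

def grippers_colocated_with(worker):
    """Predefined query: all grippers at the same location as the worker."""
    locations = objects(worker, "locatedAt")
    return sorted(
        s
        for s, p, o in triples
        if p == "locatedAt" and o in locations
        and (s, "type", "Gripper") in triples
    )

print(grippers_colocated_with("workerA"))
```

Note that the query itself is fixed, but its result tracks the live data: if workerA's locatedAt triple changed from plantX to plantY, the same query would return gripper2 instead.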

[020] The semantic processing module 112 may utilize the semantic information model 114 to select the appropriate query 116 to retrieve the desired asset information and may send a response 122 to the AR device 104 containing the asset information, which may be received by the AR device 104 at block 206 of the method 200. The AR device 104 may then render the asset information via a user interface of the AR device 104 at block 208 of the method 200. The hierarchical nature of the semantic information model 114 allows obtaining information at different levels of abstraction. For example, queries can be made to a specific type of asset (e.g., a gripper) and its attributes. However, additional asset information may also be obtained in response to the request 120 by moving within the hierarchy of the semantic information model 114. For example, by moving a level up in the hierarchy, information from a parent category that a particular asset belongs to can be obtained (e.g., robot end effector for a gripper). Similarly, by moving horizontally in the hierarchy, information about entities to which the particular asset is related can be obtained (e.g., robot arm that a gripper is attached to).
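The two hierarchy moves described above (one level up to a parent category, and horizontally to a related entity) can be sketched as follows; the subClassOf and attachedTo predicates are illustrative stand-ins for whatever relations the model 114 actually defines:

```python
# Toy triples illustrating hierarchy navigation; names are illustrative.
triples = {
    ("Gripper", "subClassOf", "RobotEndEffector"),
    ("gripper1", "type", "Gripper"),
    ("gripper1", "attachedTo", "robotArm3"),
}

def parent_category(entity):
    """One level up: parent classes of the entity's asserted type(s)."""
    types = {o for s, p, o in triples if s == entity and p == "type"}
    return {o for s, p, o in triples if s in types and p == "subClassOf"}

def related(entity, relation):
    """Horizontal move: entities linked to this one by a given relation."""
    return {o for s, p, o in triples if s == entity and p == relation}

print(parent_category("gripper1"))
print(related("gripper1", "attachedTo"))
```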

[021] The asset information rendered at block 208 may include attribute information for one or more assets. The asset information may relate to multiple inventory items corresponding to the same type of asset or may relate to all or some other subset of assets. The rendered asset information (which may form part of rendered information 124) may include a description of attributes associated with various assets. In addition, the AR device 104 may provide a user interface via which a user (e.g., a manager or technician) can provide commands to sort or filter the asset information based on various attributes that are present in the semantic information model 114 including, without limitation, asset identifier, depreciation rate, lifecycle, purchase date, duration that the asset has been in operation, and so forth. In certain example embodiments, the asset information may be color-coded or rendered with other indicia to indicate an urgency with which an asset needs replacement or repair. Assets that have surpassed or are about to reach their recommended lifecycle may be rendered with indicia indicating a more urgent need for replacement or repair than other assets that have a longer lifecycle remaining.
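The sorting and urgency indicia described above can be sketched as follows; the specific thresholds and color names are illustrative assumptions, not values prescribed by the disclosure:

```python
def urgency(hours_in_operation, lifecycle_hours):
    """Map remaining lifecycle to a rendering indicium (thresholds illustrative)."""
    remaining = lifecycle_hours - hours_in_operation
    if remaining <= 0:
        return "red"    # has surpassed its recommended lifecycle
    if remaining < 0.1 * lifecycle_hours:
        return "amber"  # about to reach its recommended lifecycle
    return "green"

# Hypothetical asset records with lifecycle attributes from the model.
assets = [
    {"id": "gripper1", "in_operation": 9_800, "lifecycle": 10_000},
    {"id": "gripper2", "in_operation": 2_000, "lifecycle": 10_000},
]

# Sort most urgent first, i.e., least remaining lifecycle first.
ordered = sorted(assets, key=lambda a: a["lifecycle"] - a["in_operation"])
print([a["id"] for a in ordered])
```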

[022] At block 210 of the method 200, user input corresponding to a user selection of a particular asset may be received at the AR device 104. This may, in turn, trigger another query from the AR device 104 to the semantic processing module 112 for information relating to the selected asset, which may be received by the AR device 104 from the semantic processing module 112 and rendered via the user interface of the AR device 104. The asset that is selected may be an asset that is indicated as requiring urgent repair or replacement. At block 212 of the method 200, user input corresponding to a user selection of a task to be performed on the selected asset may be received at the AR device 104. In certain example embodiments, upon selection of a particular asset, a set of selectable task options may be rendered within a user interface provided by the AR device 104. The task options may include, for example, replacement of the asset, disposal of the asset, maintenance of the asset, or the like.

[023] Once a manager has analyzed the status of assets (e.g., the rendered asset information 124) and has selected a task to be performed on a selected asset (e.g., replacement, disposal, or maintenance), a user interface may be provided via the AR device 104 at block 214 of the method 200. The user interface may provide the capability for the manager to view and manipulate workforce information in conjunction with the asset information. For example, a virtual map may be displayed via the user interface that allows a manager to determine the location of the selected asset visually on the virtual map. The user interface may also allow the manager to visualize and manipulate workforce information on the same virtual space in which the asset information is rendered. The user interface provides the capability to view the distribution of the workforce and their status (such as their skills) in order to assign the task to a particular workforce member.

[024] In certain example embodiments, the semantic processing module 112 may determine, at block 216 of the method 200, that no workforce member is co-located with the selected asset, and the user interface provided by the AR device 104 may indicate this. In such a scenario, the manager may be provided with the capability to manipulate the user interface to provide input, at block 218 of the method 200, to re-assign a worker to a facility containing the asset. Alternatively, the manager may manipulate the user interface to provide input to assign training to a worker who is already co-located with the selected asset. Then, at block 220 of the method 200, user input may be received via the user interface provided by the AR device 104 to assign the selected task to the worker who has been designated for re-assignment or training. On the other hand, in response to a positive determination at block 216, the method 200 may proceed directly to block 220, where user input may be received at the user interface provided by the AR device 104, where the user input corresponds to selection of a worker who is currently co-located with the selected asset to perform the selected task on the selected asset.

[025] In certain example embodiments, the data manipulation that occurs via input to the user interface provided by the AR device 104 may be reflected in the semantic information model 114. In particular, any action that a user may perform via the user interface of the AR device 104 can cause changes to the entities defined by the semantic information model 114 and updates to the underlying data values. Thus, the AR device 104 interface allows a user to intuitively manipulate the entities of the underlying semantic information model 114 and perform complex semantic queries based thereon without being exposed to the semantic queries.
For instance, the semantic processing module 112 may update the value of entities of the semantic information model 114 based at least in part on the assignment of the selected worker to perform the selected task on the selected asset. More specifically, the semantic processing module 112 may generate a first object property in the semantic information model 114 (which corresponds to a relationship between entities defined by the common semantic information model 114) that associates a first semantic entity representing the selected task with a second semantic entity representing the asset on which the task is to be performed. In addition, the semantic processing module 112 may generate a second object property in the semantic information model 114 that associates the first semantic entity with a third semantic entity representing the worker. Thus, in this manner, a user is capable of altering the underlying data based on the semantic information model 114 via input provided to a user interface provided by the AR device 104.
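The two object properties described above can be sketched as triple insertions into a toy model; the predicate names performedOn and assignedTo, and the entity identifiers, are hypothetical:

```python
# Toy model: object properties are edges (triples) between semantic entities.
triples = set()

def assign_task(task, asset, worker):
    """Record a task assignment as two object properties (names illustrative)."""
    triples.add((task, "performedOn", asset))  # first object property: task -> asset
    triples.add((task, "assignedTo", worker))  # second object property: task -> worker

# A hypothetical assignment made via the AR interface.
assign_task("replaceGripper1", "gripper1", "workerA")
print(sorted(triples))
```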

[026] Example embodiments of the disclosure also provide support to the worker (e.g., a service technician) who has been assigned a task with respect to a particular asset (e.g., replacement of a gripper that attaches to a robotic arm). Referring now to FIG. 3 in conjunction with FIG. 1, at block 302 of the method 300, user input corresponding to a request for information relating to an asset may be received from a worker at a user interface of the AR device 104. The asset may be one with respect to which the worker has been assigned a task. This may then trigger a query for the information relating to the asset from the AR device 104 to the semantic processing module 112. The semantic processing module 112 may then query the datastore(s) 106 for the information and return a response to the query to the AR device 104. The AR device 104 may receive the response containing the requested information at block 306 of the method 300. The semantic processing module 112 may utilize the semantic information model 114 to determine the appropriate query 116 to select to retrieve the requested asset information.

[027] At block 308 of the method 300, the information relating to the asset may be rendered in a user interface provided by the AR device 104. The rendered information 124 may include, for example, inventory status information for the asset. In certain example embodiments, the inventory status information may only be shown for compatible assets. For example, only inventory status information for grippers that are compatible with the robotic arm present in the real-world environment may be shown. Knowledge regarding compatible grippers may be derived from the modeling of entities in the semantic information model 114. In certain example embodiments, the rendered information 124 may also include virtual replacement assets that can be manipulated by a user within a real-world environment. The virtual replacement assets may correspond to actual assets that can serve as a replacement for the selected asset.
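The compatibility filtering described above can be sketched as follows; the compatibleWith and inventoryCount predicates are illustrative stand-ins for relations that would actually be modeled in the semantic information model 114:

```python
# Toy triples: compatibility assertions plus inventory status (names illustrative).
triples = {
    ("gripper1", "compatibleWith", "robotArm3"),
    ("gripper2", "compatibleWith", "robotArm7"),
    ("gripper1", "inventoryCount", 4),
    ("gripper2", "inventoryCount", 2),
}

def compatible_inventory(arm):
    """Inventory status only for assets compatible with the given robot arm."""
    compatible = {s for s, p, o in triples if p == "compatibleWith" and o == arm}
    return {s: o for s, p, o in triples if p == "inventoryCount" and s in compatible}

print(compatible_inventory("robotArm3"))
```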

[028] At block 310 of the method 300, user input may be received at the user interface provided by the AR device 104. The user input may correspond to user manipulations of the virtual replacement assets with respect to a real-world environment that includes the machine or machine part to which actual replacement assets corresponding to the virtual replacement assets can be physically attached. The user input received at block 310 may further include user manipulations of the virtual replacement assets with respect to a virtual machine corresponding to the actual physical machine.

[029] The user input received at block 310 can be processed by the virtual testing module 126 that may reside and execute on the server 102 to cause the virtual replacement assets to be rendered, at block 312 of the method 300, in association with a physical machine in the real-world environment. In addition, at block 314 of the method 300, the virtual testing module 126 may be executed to cause the physical machine to mimic the movements of the virtual machine that result from the user manipulations of the user interface provided by the AR device 104. In this manner, the physical machine can be brought to a same position in the real-world environment as the virtual machine in the AR environment.
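A minimal sketch of that mimicking step is a replay loop that streams the virtual machine's joint positions to the physical machine's controller; the send_joint_targets interface below is invented for illustration and does not correspond to any actual controller API:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    joints: tuple  # joint angles in degrees (illustrative representation)

commands_sent = []

def send_joint_targets(joints):
    """Stand-in for a real controller interface (hypothetical)."""
    commands_sent.append(joints)

def mimic(virtual_poses):
    """Replay each manipulated virtual pose on the physical machine so that
    it ends in the same position as the virtual machine."""
    for pose in virtual_poses:
        send_joint_targets(pose.joints)
    return commands_sent[-1] if commands_sent else None

# Two hypothetical poses produced by user manipulations in the AR interface.
final = mimic([Pose((0, 45, 90)), Pose((10, 40, 95))])
print(final)
```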

[030] In the above manner, example embodiments of the disclosure enable a user (e.g., a worker assigned a replacement task) to perform feasibility tests to virtually test different assets (some of which may not currently be in inventory) to ensure that they are suitable for use in connection with a real-world setup. For instance, a worker can virtually test different grippers to ensure that a selected gripper is suitable for a particular robotic arm within the constraints of a particular real-world setup (e.g., a picking task considering the obstacles present in the real-world environment). In certain example embodiments, if a virtual replacement asset fails the feasibility testing, the semantic processing module 112 may update the semantic information model 114 to reflect this. In particular, the semantic processing module 112 may disassociate a particular semantic entity representing an actual replacement asset corresponding to the virtual replacement asset that failed the testing from a semantic entity representing the machine with which the actual replacement asset is associated.
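The disassociation step can be sketched as removing the corresponding triple from the toy model introduced earlier; the replacementFor predicate is illustrative:

```python
# Toy triples: candidate replacement relationships (names illustrative).
triples = {
    ("gripper2", "replacementFor", "robotArm3"),
    ("gripper5", "replacementFor", "robotArm3"),
}

def record_failed_test(asset, machine):
    """Disassociate a replacement asset that failed virtual feasibility testing."""
    triples.discard((asset, "replacementFor", machine))

# Hypothetical outcome: gripper2 failed the virtual test against robotArm3.
record_failed_test("gripper2", "robotArm3")
print(sorted(triples))
```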

[031] One or more illustrative embodiments of the disclosure have been described above. The above-described embodiments are merely illustrative of the scope of this disclosure and are not intended to be limiting in any way. Accordingly, variations, modifications, and equivalents of embodiments disclosed herein are also within the scope of this disclosure. The above-described embodiments and additional and/or alternative embodiments of the disclosure will be described in detail hereinafter through reference to the accompanying drawings.

[032] FIG. 4 is a schematic diagram of an illustrative networked architecture 400 in accordance with one or more example embodiments of the disclosure. The networked architecture 400 may include one or more servers 402 and one or more AR devices 404. While multiple devices 404 and/or multiple servers 402 may form part of the networked architecture 400, these components will be described in the singular hereinafter for ease of explanation. However, it should be appreciated that any functionality described in connection with the server 402 may be distributed among multiple servers 402 and/or among one or more devices 404. In one or more example embodiments, the server 102 may have the illustrative configuration of server 402 and the AR device 404 may be the AR device 104.

[033] The server 402 may be configured to communicate with the AR device 404 (e.g., a wearable device such as a head-mounted display) via one or more networks 406 which may include, but are not limited to, any one or more different types of communications networks such as, for example, cable networks, public networks (e.g., the Internet), private networks (e.g., frame-relay networks), wireless networks, cellular networks, telephone networks (e.g., a public switched telephone network), or any other suitable private or public packet-switched or circuit-switched networks. Further, the network(s) 406 may have any suitable communication range associated therewith and may include, for example, global networks (e.g., the Internet), metropolitan area networks (MANs), wide area networks (WANs), local area networks (LANs), or personal area networks (PANs). In addition, the network(s) 406 may include communication links and associated networking devices (e.g., link-layer switches, routers, etc.) for transmitting network traffic over any suitable type of medium including, but not limited to, coaxial cable, twisted-pair wire (e.g., twisted-pair copper wire), optical fiber, a hybrid fiber-coaxial (HFC) medium, a microwave medium, a radio frequency communication medium, a satellite communication medium, or any combination thereof.

[034] In an illustrative configuration, the server 402 may include one or more processors (processor(s)) 408, one or more memory devices 410 (generically referred to herein as memory 410), one or more input/output ("I/O") interface(s) 412, one or more network interfaces 414, and data storage 416. The server 402 may further include one or more buses 418 that functionally couple various components of the server 402. These various components will be described in more detail hereinafter.

[035] The bus(es) 418 may include at least one of a system bus, a memory bus, an address bus, or a message bus, and may permit exchange of information (e.g., data (including computer-executable code), signaling, etc.) between various components of the server 402. The bus(es) 418 may include, without limitation, a memory bus or a memory controller, a peripheral bus, an accelerated graphics port, and so forth. The bus(es) 418 may be associated with any suitable bus architecture including, without limitation, an Industry Standard Architecture (ISA), a Micro Channel Architecture (MCA), an Enhanced ISA (EISA), a Video Electronics Standards Association (VESA) architecture, an Accelerated Graphics Port (AGP) architecture, a Peripheral Component Interconnects (PCI) architecture, a PCI-Express architecture, a Personal Computer Memory Card International Association (PCMCIA) architecture, a Universal Serial Bus (USB) architecture, and so forth.

[036] The memory 410 of the server 402 may include volatile memory (memory that maintains its state when supplied with power) such as random access memory (RAM) and/or non-volatile memory (memory that maintains its state even when not supplied with power) such as read-only memory (ROM), flash memory, ferroelectric RAM (FRAM), and so forth. Persistent data storage, as that term is used herein, may include non-volatile memory. In certain example embodiments, volatile memory may enable faster read/write access than non-volatile memory. However, in certain other example embodiments, certain types of non-volatile memory (e.g., FRAM) may enable faster read/write access than certain types of volatile memory.

[037] In various implementations, the memory 410 may include multiple different types of memory such as various types of static random access memory (SRAM), various types of dynamic random access memory (DRAM), various types of unalterable ROM, and/or writeable variants of ROM such as electrically erasable programmable read-only memory (EEPROM), flash memory, and so forth. The memory 410 may include main memory as well as various forms of cache memory such as instruction cache(s), data cache(s), translation lookaside buffer(s) (TLBs), and so forth. Further, cache memory such as a data cache may be a multi-level cache organized as a hierarchy of one or more cache levels (L1, L2, etc.).

[038] The data storage 416 may include removable storage and/or non-removable storage including, but not limited to, magnetic storage, optical disk storage, and/or tape storage. The data storage 416 may provide non-volatile storage of computer-executable instructions and other data. The memory 410 and the data storage 416, removable and/or non-removable, are examples of computer-readable storage media (CRSM) as that term is used herein.

[039] The data storage 416 may store computer-executable code, instructions, or the like that may be loadable into the memory 410 and executable by the processor(s) 408 to cause the processor(s) 408 to perform or initiate various operations. The data storage 416 may additionally store data that may be copied to memory 410 for use by the processor(s) 408 during the execution of the computer-executable instructions. Moreover, output data generated as a result of execution of the computer-executable instructions by the processor(s) 408 may be stored initially in memory 410, and may ultimately be copied to data storage 416 for non-volatile storage.

[040] More specifically, the data storage 416 may store one or more operating systems (O/S) 420; one or more database management systems (DBMS) 422; and one or more program modules, applications, engines, computer-executable code, scripts, or the like such as, for example, a semantic processing module 424 and a virtual testing module 426. Any of the components depicted as being stored in data storage 416 may include any combination of software, firmware, and/or hardware. The software and/or firmware may include computer-executable code, instructions, or the like that may be loaded into the memory 410 for execution by one or more of the processor(s) 408 to perform any of the operations described earlier in connection with correspondingly named modules.

[041] The networked architecture 400 may further include one or more datastores 428 that may be accessible via the network(s) 406 by the server 402 and/or the AR device 404. The datastore(s) 428 may include the datastore(s) 106 shown in FIG. 1 and any of the data depicted as being stored therein. The datastore(s) 428 may include, but are not limited to, databases (e.g., relational, object-oriented, etc.), file systems, flat files, distributed datastores in which data is stored on more than one node of a computer network, peer-to-peer network datastores, or the like.
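For illustration only, a semantic processing module such as the semantic processing module 424 might customize a query for asset information based on a user's AR input along the following lines. This is a hedged sketch, not the disclosed implementation: the function name and the ontology terms (:locatedIn, :hasStatus, :lastServicedOn) are hypothetical assumptions.

```python
from typing import Optional


def build_asset_query(asset_type: str, facility: Optional[str] = None) -> str:
    """Build a SPARQL-style query string customized by the user's input.

    The triple patterns below are illustrative; a real semantic information
    model would define its own classes and object properties.
    """
    patterns = [f"?asset a :{asset_type} ."]
    if facility is not None:
        # Narrow the query to assets at the facility the user selected.
        patterns.append(f"?asset :locatedIn :{facility} .")
    patterns.append("?asset :hasStatus ?status .")
    patterns.append("?asset :lastServicedOn ?lastService .")
    body = "\n  ".join(patterns)
    return "SELECT ?asset ?status ?lastService WHERE {\n  " + body + "\n}"


query = build_asset_query("Turbine", facility="PlantA")
print(":locatedIn :PlantA" in query)  # prints True
```

The resulting query string could then be dispatched to whatever query endpoint backs the semantic information model, with the response rendered on the AR device.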

[042] The data storage 416 may further store various types of data utilized by components of the server 402 such as, for example, any of the data stored in the datastore(s) 428. Any data stored in the data storage 416 may be loaded into the memory 410 for use by the processor(s) 408 in executing computer-executable code. In addition, any data stored in the datastore(s) 428 may be accessed via the DBMS 422 and loaded in the memory 410 for use by the processor(s) 408 in executing computer-executable code.

[043] The processor(s) 408 may be configured to access the memory 410 and execute computer-executable instructions loaded therein. For example, the processor(s) 408 may be configured to execute computer-executable instructions of the various program modules, applications, engines, or the like of the server 402 to cause or facilitate various operations to be performed in accordance with one or more embodiments of the disclosure. The processor(s) 408 may include any suitable processing unit capable of accepting data as input, processing the input data in accordance with stored computer-executable instructions, and generating output data. The processor(s) 408 may include any type of suitable processing unit including, but not limited to, a central processing unit, a microprocessor, a Reduced Instruction Set Computer (RISC) microprocessor, a Complex Instruction Set Computer (CISC) microprocessor, a microcontroller, an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a System-on-a-Chip (SoC), a digital signal processor (DSP), and so forth. Further, the processor(s) 408 may have any suitable microarchitecture design that includes any number of constituent components such as, for example, registers, multiplexers, arithmetic logic units, cache controllers for controlling read/write operations to cache memory, branch predictors, or the like. The microarchitecture design of the processor(s) 408 may be capable of supporting any of a variety of instruction sets.

[044] Referring now to other illustrative components depicted as being stored in the data storage 416, the O/S 420 may be loaded from the data storage 416 into the memory 410 and may provide an interface between other application software executing on the server 402 and hardware resources of the server 402. More specifically, the O/S 420 may include a set of computer-executable instructions for managing hardware resources of the server 402 and for providing common services to other application programs (e.g., managing memory allocation among various application programs). In certain example embodiments, the O/S 420 may control execution of one or more of the program modules depicted as being stored in the data storage 416. The O/S 420 may include any operating system now known or which may be developed in the future including, but not limited to, any server operating system, any mainframe operating system, or any other proprietary or non-proprietary operating system.

[045] The DBMS 422 may be loaded into the memory 410 and may support functionality for accessing, retrieving, storing, and/or manipulating data stored in the memory 410, data stored in the datastore(s) 428, and/or data stored in the data storage 416. The DBMS 422 may use any of a variety of database models (e.g., relational model, object model, etc.) and may support any of a variety of query languages. The DBMS 422 may access data represented in one or more data schemas and stored in any suitable data repository.
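The semantic-model updates described elsewhere in this disclosure (e.g., associating a task entity with an asset entity and a worker entity when a worker is assigned) can be sketched as triple operations over a minimal in-memory store. The class and the entity identifiers below are purely illustrative assumptions, not the disclosed DBMS 422 or semantic information model.

```python
class SemanticModel:
    """Toy triple store standing in for a semantic information model."""

    def __init__(self):
        # Each entry is a (subject, object_property, object) triple.
        self.triples = set()

    def associate(self, subject, object_property, obj):
        """Relate two semantic entities via an object property."""
        self.triples.add((subject, object_property, obj))

    def disassociate(self, subject, object_property, obj):
        """Remove the relation between two semantic entities, if present."""
        self.triples.discard((subject, object_property, obj))

    def objects(self, subject, object_property):
        """Return all entities related to `subject` by `object_property`."""
        return {o for s, p, o in self.triples
                if s == subject and p == object_property}


model = SemanticModel()
# Assign a task to a particular asset and to a particular worker, mirroring
# the first/second/third semantic entities of the worker-assignment flow.
model.associate("task:inspect-42", "performedOn", "asset:pump-7")
model.associate("task:inspect-42", "assignedTo", "worker:alice")
print(model.objects("task:inspect-42", "performedOn"))  # prints {'asset:pump-7'}
```

A production system would more likely express these operations as updates against a graph database or triplestore, but the association/disassociation semantics are the same.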

[046] Referring now to other illustrative components of the server 402, the input/output (I/O) interface(s) 412 may facilitate the receipt of input information by the server 402 from one or more I/O devices as well as the output of information from the server 402 to the one or more I/O devices. The I/O devices may include any of a variety of components such as a display or display screen having a touch surface or touchscreen; an audio output device for producing sound, such as a speaker; an audio capture device, such as a microphone; an image and/or video capture device, such as a camera; a haptic unit; and so forth. Any of these components may be integrated into the server 402 or may be separate. The I/O devices may further include, for example, any number of peripheral devices such as data storage devices, printing devices, and so forth.

[047] The I/O interface(s) 412 may also include an interface for an external peripheral device connection such as universal serial bus (USB), FireWire, Thunderbolt, Ethernet port or other connection protocol that may connect to one or more networks. The I/O interface(s) 412 may also include a connection to one or more antennas to connect to one or more networks via a wireless local area network (WLAN) (such as Wi-Fi) radio, Bluetooth, and/or a wireless network radio, such as a radio capable of communication with a wireless communication network such as a Long Term Evolution (LTE) network, WiMAX network, 3G network, etc.

[048] The server 402 may further include one or more network interfaces 414 via which the server 402 may communicate with any of a variety of other systems, platforms, networks, devices, and so forth. The network interface(s) 414 may enable communication, for example, with the AR device 404 and/or the datastore(s) 428 via the network(s) 406.

[049] Referring now to the AR device 404, in an illustrative configuration, the AR device 404 may include similar hardware and/or software components as those depicted in connection with the illustrative configuration of the server 402. Further, the AR device 404 may include one or more sensors/sensor interfaces that may include or may be capable of interfacing with any suitable type of sensing device such as, for example, inertial sensors, force sensors, thermal sensors, optical sensors, time-of-flight sensors, and so forth. Example types of inertial sensors may include accelerometers (e.g., MEMS-based accelerometers), gyroscopes, and so forth. In addition, the AR device 404 may include one or more projection elements and a display capable of receiving input and displaying output.

[050] It should be appreciated that the program modules, applications, computer-executable instructions, code, or the like depicted in FIG. 4 as being stored in the data storage 416 are merely illustrative and not exhaustive and that processing described as being supported by any particular module may alternatively be distributed across multiple modules or performed by a different module. In addition, various program module(s), script(s), plug-in(s), Application Programming Interface(s) (API(s)), or any other suitable computer-executable code hosted locally on the server 402, the AR device 404, and/or hosted on other computing device(s) accessible via one or more of the network(s) 406, may be provided to support functionality provided by the program modules, applications, or computer-executable code depicted in FIG. 4 and/or additional or alternate functionality. Further, functionality may be modularized differently such that processing described as being supported collectively by the collection of program modules depicted in FIG. 4 may be performed by a fewer or greater number of modules, or functionality described as being supported by any particular module may be supported, at least in part, by another module. In addition, program modules that support the functionality described herein may form part of one or more applications executable across any number of systems or devices in accordance with any suitable computing model such as, for example, a client-server model, a peer-to-peer model, and so forth. In addition, any of the functionality described as being supported by any of the program modules depicted in FIG. 4 may be implemented, at least partially, in hardware and/or firmware across any number of devices.

[051] It should further be appreciated that the server 402 may include alternate and/or additional hardware, software, or firmware components beyond those described or depicted without departing from the scope of the disclosure. More particularly, it should be appreciated that software, firmware, or hardware components depicted as forming part of the server 402 are merely illustrative and that some components may not be present or additional components may be provided in various embodiments. While various illustrative program modules have been depicted and described as software modules stored in data storage 416, it should be appreciated that functionality described as being supported by the program modules may be enabled by any combination of hardware, software, and/or firmware. It should further be appreciated that each of the above-mentioned modules may, in various embodiments, represent a logical partitioning of supported functionality. This logical partitioning is depicted for ease of explanation of the functionality and may not be representative of the structure of software, hardware, and/or firmware for implementing the functionality. Accordingly, it should be appreciated that functionality described as being provided by a particular module may, in various embodiments, be provided at least in part by one or more other modules. Further, one or more depicted modules may not be present in certain embodiments, while in other embodiments, additional modules not depicted may be present and may support at least a portion of the described functionality and/or additional functionality. Moreover, while certain modules may be depicted and described as sub-modules of another module, in certain embodiments, such modules may be provided as independent modules or as sub-modules of other modules.

[052] One or more operations of the method 200 or the method 300 may be performed by a server 402, by an AR device 404, or in a distributed fashion by a server 402 and such a device 404, where the server 402 may have the illustrative configuration depicted in FIG. 4, or more specifically, such operation(s) may be performed by one or more engines, program modules, applications, or the like executable on such device(s). It should be appreciated, however, that such operations may be implemented in connection with numerous other device configurations.

[053] The operations described and depicted in the illustrative methods of FIGS. 2 and 3 may be carried out or performed in any suitable order as desired in various example embodiments of the disclosure. Additionally, in certain example embodiments, at least a portion of the operations may be carried out in parallel. Furthermore, in certain example embodiments, fewer, more, or different operations than those depicted in FIGS. 2 and 3 may be performed.

[054] Although specific embodiments of the disclosure have been described, one of ordinary skill in the art will recognize that numerous other modifications and alternative embodiments are within the scope of the disclosure. For example, any of the functionality and/or processing capabilities described with respect to a particular device or component may be performed by any other device or component. Further, while various illustrative implementations and architectures have been described in accordance with embodiments of the disclosure, one of ordinary skill in the art will appreciate that numerous other modifications to the illustrative implementations and architectures described herein are also within the scope of this disclosure. In addition, it should be appreciated that any operation, element, component, data, or the like described herein as being based on another operation, element, component, data, or the like can be additionally based on one or more other operations, elements, components, data, or the like. Accordingly, the phrase "based on," or variants thereof, should be interpreted as "based at least in part on."

[055] Although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments. Conditional language, such as, among others, "can," "could," "might," or "may," unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment.

[056] The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.

[057] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.

[058] Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.

[059] Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.

[060] Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.

[061] These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.

[062] The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.

[063] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.

Claims

WHAT IS CLAIMED IS:
1. A method, comprising: receiving, at an augmented-reality (AR) device, first user input corresponding to a request for asset information; triggering, from the AR device, a query for the asset information, wherein the query is customized based at least in part on the first user input; receiving, at the AR device, a response to the query, the response comprising the asset information; rendering the asset information via the AR device; receiving, at the AR device, second user input corresponding to a selection of a particular asset; receiving, at the AR device, third user input corresponding to a selection of a task to be performed on the particular asset; providing, via the AR device, a user interface for viewing and manipulating workforce information in conjunction with the asset information; receiving, at the user interface, fourth user input corresponding to user manipulations of the user interface; determining that the fourth user input corresponds to assignment of a particular worker to perform the task on the particular asset; updating values of entities in a semantic information model to associate a first semantic entity representing the task with a second semantic entity representing the particular asset; and updating the values of entities in the semantic information model to associate the first semantic entity with a third semantic entity representing the particular worker.
2. The method of claim 1, wherein updating the values of entities in the semantic information model comprises: associating a first object property in the semantic information model that relates the first semantic entity with the second semantic entity; and associating a second object property in the semantic information model that relates the first semantic entity with the third semantic entity.
3. The method of claim 1, further comprising: rendering, via the user interface, an indication that no workers are currently assigned to a facility or location that contains the particular asset; receiving, via the user interface, fifth user input to re-assign the particular worker to the facility or location; and updating the semantic information model to associate the third semantic entity with a fourth semantic entity representing the facility.
4. The method of claim 1, wherein rendering the asset information via the AR device comprises rendering a set of sortable asset attributes.
5. The method of claim 4, further comprising: receiving, at the user interface, fifth user input to sort the asset information with respect to a particular asset attribute; sorting the asset information with respect to the particular asset attribute to generate sorted asset information; and rendering, via the user interface, the sorted asset information.
6. The method of claim 1, further comprising: receiving, at the AR device, fifth user input corresponding to user manipulations of virtual replacement assets for the particular asset; and rendering, via the AR device, the virtual replacement assets in association with a physical machine in a real-world environment to enable virtual testing of the virtual replacement assets.
7. The method of claim 6, further comprising: receiving, at the AR device, sixth user input indicating that a particular virtual replacement asset failed the virtual testing; and updating the values of entities in the semantic information model to disassociate a fourth semantic entity representing an actual replacement asset corresponding to the particular virtual replacement asset from a fifth semantic entity representing the physical machine.
8. A system, comprising: at least one memory storing computer-executable instructions; and at least one processor configured to access the at least one memory and execute the computer-executable instructions to: receive, at an augmented-reality (AR) device, first user input corresponding to a request for asset information; trigger, from the AR device, a query for the asset information, wherein the query is customized based at least in part on the first user input; receive, at the AR device, a response to the query, the response comprising the asset information; render the asset information via the AR device; receive, at the AR device, second user input corresponding to a selection of a particular asset; receive, at the AR device, third user input corresponding to a selection of a task to be performed on the particular asset; provide, via the AR device, a user interface for viewing and manipulating workforce information in conjunction with the asset information; receive, at the user interface, fourth user input corresponding to user manipulations of the user interface; determine that the fourth user input corresponds to assignment of a particular worker to perform the task on the particular asset; update values of entities in a semantic information model to associate a first semantic entity representing the task with a second semantic entity representing the particular asset; and update the values of entities in the semantic information model to associate the first semantic entity with a third semantic entity representing the particular worker.
9. The system of claim 8, wherein the at least one processor is configured to update the values of entities in the semantic information model by executing the computer-executable instructions to: associate a first object property in the semantic information model that relates the first semantic entity with the second semantic entity; and associate a second object property in the semantic information model that relates the first semantic entity with the third semantic entity.
10. The system of claim 8, wherein the at least one processor is further configured to execute the computer-executable instructions to: render, via the user interface, an indication that no workers are currently assigned to a facility or location that contains the particular asset; receive, via the user interface, fifth user input to re-assign the particular worker to the facility or location; and update the values of entities in the semantic information model to associate the third semantic entity with a fourth semantic entity representing the facility.
11. The system of claim 8, wherein the at least one processor is configured to render the asset information via the AR device by executing the computer-executable instructions to render a set of sortable asset attributes.
12. The system of claim 11, wherein the at least one processor is further configured to execute the computer-executable instructions to: receive, at the user interface, fifth user input to sort the asset information with respect to a particular asset attribute; sort the asset information with respect to the particular asset attribute to generate sorted asset information; and render, via the user interface, the sorted asset information.
13. The system of claim 8, wherein the at least one processor is further configured to execute the computer-executable instructions to: receive, at the AR device, fifth user input corresponding to user manipulations of virtual replacement assets for the particular asset; and render, via the AR device, the virtual replacement assets in association with a physical machine in a real-world environment to enable virtual testing of the virtual replacement assets.
14. The system of claim 13, wherein the at least one processor is further configured to execute the computer-executable instructions to: receive, at the AR device, sixth user input indicating that a particular virtual replacement asset failed the virtual testing; and update the values of entities in the semantic information model to disassociate a fourth semantic entity representing an actual replacement asset corresponding to the particular virtual replacement asset from a fifth semantic entity representing the physical machine.
15. A computer program product comprising a storage medium readable by a processing circuit, the storage medium storing instructions executable by the processing circuit to cause a method to be performed, the method comprising: receiving, at an augmented-reality (AR) device, first user input corresponding to a request for asset information; triggering, from the AR device, a query for the asset information, wherein the query is customized based at least in part on the first user input; receiving, at the AR device, a response to the query, the response comprising the asset information; rendering the asset information via the AR device; receiving, at the AR device, second user input corresponding to a selection of a particular asset; receiving, at the AR device, third user input corresponding to a selection of a task to be performed on the particular asset; providing, via the AR device, a user interface for viewing and manipulating workforce information in conjunction with the asset information; receiving, at the user interface, fourth user input corresponding to user manipulations of the user interface; determining that the fourth user input corresponds to assignment of a particular worker to perform the task on the particular asset; updating the values of entities in a semantic information model to associate a first semantic entity representing the task with a second semantic entity representing the particular asset; and updating the values of entities in the semantic information model to associate the first semantic entity with a third semantic entity representing the particular worker.
16. The computer program product of claim 15, further comprising:
rendering, via the user interface, an indication that no workers are currently assigned to a facility or location that contains the particular asset;
receiving, via the user interface, fifth user input to re-assign the particular worker to the facility or location; and
updating the values of entities in the semantic information model to associate the third semantic entity with a fourth semantic entity representing the facility.
17. The computer program product of claim 15, wherein rendering the asset information via the AR device comprises rendering a set of sortable asset attributes, and wherein the method further comprises:
receiving, at the user interface, fifth user input to sort the asset information with respect to a particular asset attribute;
sorting the asset information with respect to the particular asset attribute to generate sorted asset information; and
rendering, via the user interface, the sorted asset information.
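The sort step of claim 17 can be sketched as an ordinary keyed sort over asset records. The record fields below ("temperature", "last_service") are hypothetical examples of sortable asset attributes, not attributes named in the application.

```python
# Hypothetical asset records as they might back the AR interface.
assets = [
    {"id": "A-3", "temperature": 71.5, "last_service": "2017-03-01"},
    {"id": "A-1", "temperature": 64.0, "last_service": "2017-05-12"},
    {"id": "A-2", "temperature": 90.2, "last_service": "2016-11-30"},
]

def sort_assets(records, attribute, descending=False):
    """Generate sorted asset information with respect to one attribute."""
    return sorted(records, key=lambda record: record[attribute], reverse=descending)

hottest_first = sort_assets(assets, "temperature", descending=True)
print([a["id"] for a in hottest_first])  # → ['A-2', 'A-3', 'A-1']
```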
18. The computer program product of claim 15, the method further comprising:
receiving, at the AR device, fifth user input corresponding to user manipulations of virtual replacement assets for the particular asset; and
rendering, via the AR device, the virtual replacement assets in association with a physical machine in a real-world environment to enable virtual testing of the virtual replacement assets.
19. The computer program product of claim 18, the method further comprising:
receiving, at the AR device, sixth user input indicating that a particular virtual replacement asset failed the virtual testing; and
updating the values of entities in the semantic information model to disassociate a fourth semantic entity representing an actual replacement asset corresponding to the particular virtual replacement asset from a fifth semantic entity representing the physical machine.
20. The computer program product of claim 15, the method further comprising:
receiving, at the AR device, sixth user input corresponding to user manipulations of a virtual machine in an AR environment; and
causing the physical machine to mimic movements of the virtual machine that result from the user manipulations of the virtual machine such that the physical machine is brought to a same position in the real-world environment as the virtual machine in the AR environment.
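Claim 20's mirror step can be pictured as forwarding the virtual machine's final pose to the physical machine's controller. The sketch below assumes a joint-angle pose representation; `PhysicalMachine` and `move_joint` are hypothetical stand-ins for whatever robot-controller API a real deployment would use, not interfaces described in the application.

```python
class PhysicalMachine:
    """Hypothetical controller facade for the real-world machine."""

    def __init__(self, joint_count):
        self.joints = [0.0] * joint_count  # current joint angles in degrees

    def move_joint(self, index, angle_deg):
        # A real controller would command the actuator; here we just record it.
        self.joints[index] = angle_deg


def mimic(virtual_joint_angles, machine):
    """Bring the physical machine to the same pose as the manipulated virtual machine."""
    for index, angle in enumerate(virtual_joint_angles):
        machine.move_joint(index, angle)


robot = PhysicalMachine(joint_count=3)
mimic([15.0, -30.0, 90.0], robot)  # pose produced by the AR-side manipulation
print(robot.joints)  # → [15.0, -30.0, 90.0]
```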
PCT/US2017/067775 2017-06-01 2017-12-21 Semantic information model and enhanced reality interface for workforce and asset management WO2018222225A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201762513492P 2017-06-01 2017-06-01
US62/513,492 2017-06-01

Publications (1)

Publication Number Publication Date
WO2018222225A1 (en) 2018-12-06

Family

ID=61003384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/067775 WO2018222225A1 (en) 2017-06-01 2017-12-21 Semantic information model and enhanced reality interface for workforce and asset management

Country Status (1)

Country Link
WO (1) WO2018222225A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2422234A (en) * 2004-12-10 2006-07-19 Fisher Rosemount Systems Inc Wireless handheld communicator in a process control environment
EP2783812A2 (en) * 2013-03-18 2014-10-01 Kabushiki Kaisha Yaskawa Denki Robot device and method for manufacturing an object
US20160158937A1 (en) * 2014-12-08 2016-06-09 Fanuc Corporation Robot system having augmented reality-compatible display
US20160257000A1 (en) * 2015-03-04 2016-09-08 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment


Similar Documents

Publication Publication Date Title
JP2015529361A (en) Centralized information technology infrastructure management with generic object instances
EP3077926B1 (en) Pattern matching across multiple input data streams
US20140173618A1 (en) System and method for management of big data sets
JP5806049B2 (en) point cloud generation system
US8806441B2 (en) Static code analysis
Ge et al. An iterative approach for development of safety-critical software and safety arguments
US9760635B2 (en) Dynamic search engine for an industrial environment
US8726236B2 (en) Determining context specific content
GB2523338A (en) Testing a virtualised network function in a network
JP2017529593A (en) Placement policy-based allocation of computing resources
CN104950741B (en) For the configuration management interface of the multi-controller of system connection
JP2013117959A (en) Simulation and visualization for project planning and management
US9223610B2 (en) Management of virtual machine snapshots
US9838844B2 (en) Using augmented reality to assist data center operators
US8332431B2 (en) Configuration information management apparatus, configuration information management program, and configuration information management method
US9170119B2 (en) Method and system for dynamically adapting user interfaces in vehicle navigation systems to minimize interaction complexity
US20140146038A1 (en) Augmented display of internal system components
US20150178050A1 (en) Customer Tailored Release Master Plan Generation for Hybrid Networked Solutions
CN104102760B (en) Positioning system for three-dimensional visualization
US10198562B2 (en) Detecting and tracking virtual containers
CN108141380A (en) Network-based resource distribution finds service
US9612821B2 (en) Predicting the success of a continuous software deployment pipeline
US10061481B2 (en) Methods and devices for visually querying an aircraft based on an area of an image
US20170048314A1 (en) Migrating cloud resources
US10462210B2 (en) Techniques for automated installation, packing, and configuration of cloud storage services

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17832140

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE