WO2011135968A1 - Information processing system, information processing method, and program - Google Patents
Information processing system, information processing method, and program
- Publication number
- WO2011135968A1 (PCT/JP2011/057959)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- repair work
- fault repair
- augmented reality
- failure
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Responding to the occurrence of a fault, e.g. fault tolerance
- G06F11/0703—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
- G06F11/0706—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation the processing taking place on a specific hardware platform or in a specific software environment
- G06F11/0727—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation the processing taking place on a specific hardware platform or in a specific software environment in a storage system, e.g. in a DASD or network based storage system
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Responding to the occurrence of a fault, e.g. fault tolerance
- G06F11/0703—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
- G06F11/079—Root cause analysis, i.e. error or fault diagnosis
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
Definitions
- The present invention relates to a technology for superimposing and displaying computer graphics image data on real-space video.
- As disclosed in Patent Document 1, there is a known technique that uses Augmented Reality (hereinafter, AR) to simplify the removal of a paper jam in a copying machine. Specifically, when a sheet is not detected by a predetermined sensor within a predetermined time, it is judged that a paper jam has occurred, and an internal image of the copier together with an operation procedure prepared in advance for that jam position is displayed superimposed on the real view. As a result, the location of the paper jam can be recognized easily, and the removal operation can be guided easily.
- AR is a technique for providing supplementary information by superimposing computer-managed data (hereinafter, computer graphics image data) such as characters, graphics, still images, and moving images on the real environment.
- The technique of Patent Document 1, however, assumes failures in advance and defines countermeasures for those assumed failures beforehand. A new, unassumed failure may arise, for example, from a combination of multiple failure factors; in such a case, the technique of Patent Document 1 cannot guide optimal failure repair work.
- An object of the present invention is therefore to provide work support that guides optimal failure repair work even when an unexpected failure occurs.
- A first aspect of the information processing system of the present invention comprises an augmented reality presentation device capable of combining and displaying real-space video with computer graphics image data, and a failure analysis device capable of analyzing a failure occurring in a computer system.
- The failure analysis device includes acquisition means for acquiring information related to the operating status of the computer system, determination means for determining, based on the information acquired by the acquisition means, information related to failure repair work for the computer system, and transmission means for transmitting the information on the failure repair work determined by the determination means to the augmented reality presentation device.
- The augmented reality presentation device has presentation means for combining computer graphics image data that guides the method of the failure repair work with the real-space video, based on the information related to the failure repair work, and presenting the result.
- When the acquisition means newly acquires information related to the operating status of the computer system after the failure repair work performed under the guidance presented by the augmented reality presentation device, the determination means newly determines information related to the failure repair work for the computer system based on that information, and the transmission means transmits the newly determined information on the failure repair work to the augmented reality presentation device.
- An information processing method of the present invention is executed in an information processing system comprising an augmented reality presentation device capable of combining and displaying real-space video with computer graphics image data, and a failure analysis device capable of analyzing a failure occurring in a computer system.
- The method includes an acquisition step in which the failure analysis device acquires information related to the operating status of the computer system, a determination step of determining, based on the acquired information, information related to failure repair work for the computer system, and a transmission step of transmitting the information determined in the determination step to the augmented reality presentation device.
- It further includes a presentation step in which the augmented reality presentation device combines computer graphics image data that guides the method of the failure repair work with the real-space video, based on the information related to the failure repair work, and presents the result.
- After the failure repair work performed under the guidance presented by the augmented reality presentation device, the failure analysis device newly acquires information related to the operating status of the computer system, newly determines information related to the failure repair work for the computer system based on it, and transmits the newly determined information on the failure repair work to the augmented reality presentation device.
- A program of the present invention causes a computer to execute an information processing method in an information processing system comprising an augmented reality presentation device capable of combining and displaying real-space video with computer graphics image data, and a failure analysis device capable of analyzing a failure occurring in the computer system.
- The program includes an acquisition step in which the failure analysis device acquires information related to the operating status of the computer system, a determination step of determining information related to failure repair work based on the acquired information, a transmission step of transmitting that information to the augmented reality presentation device, and a presentation step in which the augmented reality presentation device combines computer graphics image data guiding the method of the failure repair work with the real-space video based on the information on the failure repair work and presents the result.
- It further includes a step in which, when information related to the operating status of the computer system is newly acquired by the acquisition step after the failure repair work performed under the guidance presented by the augmented reality presentation device, the failure analysis device newly determines the failure repair work.
- Another aspect comprises an augmented reality presentation device capable of combining and displaying real-space video with computer graphics image data, and a failure analysis device capable of analyzing a failure occurring in a repair object.
- The failure analysis device includes transmission means, and the augmented reality presentation device includes presentation means for combining computer graphics image data that guides the method of the failure repair work with the real-space video, based on information on the failure repair work, and presenting the result.
- When information related to the operating status of the repair object is newly acquired after the failure repair work performed under the guidance presented by the augmented reality presentation device, the determination means newly determines information related to the failure repair work for the repair object based on that information, and the transmission means transmits the newly determined information on the failure repair work to the augmented reality presentation device.
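The acquire, determine, transmit, and re-acquire cycle described in the aspects above can be sketched as follows. This is a minimal illustrative sketch, not the claimed implementation: the status representation, device names, and `transmit` callback are all assumptions.

```python
# Hypothetical sketch of the claimed loop: acquire operating status, determine
# repair work, transmit it to the AR presentation device, then re-acquire after
# the repair to decide whether further work is needed. All names are illustrative.

def determine_repair_work(status):
    """Map an operating-status snapshot to a (worker, operation, target)
    instruction, or None when no fault remains."""
    if status.get("web_server_b", "up") == "down":
        return ("worker_y", "restart", "web_server_b")
    if status.get("proxy_server_c", "up") == "down":
        return ("worker_x", "restart", "proxy_server_c")
    return None  # no fault detected: repair work is complete

def repair_loop(snapshots, transmit):
    """Each snapshot models one acquisition step; the loop ends when a
    newly acquired snapshot shows no remaining fault."""
    for status in snapshots:
        work = determine_repair_work(status)
        if work is None:
            transmit(("done",))   # notify the AR device that repair is complete
            return
        transmit(work)            # send the next instruction to the AR device
```

For example, a run over a faulty snapshot followed by a healthy one would transmit one repair instruction and then the completion notice.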
- FIG. 1 is a diagram schematically showing a configuration of a failure analysis system according to an embodiment of the present invention.
- FIG. 2A is a diagram illustrating a hardware configuration of the failure analysis apparatus.
- FIG. 2B is a diagram illustrating a hardware configuration of the HMD and the small camera.
- FIG. 3 is a flowchart showing the flow of processing of the failure analysis apparatus.
- FIG. 4A is a flowchart showing the processing of the HMD.
- FIG. 4B is a flowchart showing the processing of the HMD.
- FIG. 4C is a flowchart showing the processing of the HMD.
- FIG. 5 is a diagram illustrating an example of a state in which the computer graphics image data for guiding the route from the current location to the proxy server c that is the target object of the fault repair work is combined and displayed on the real space video in the HMD.
- FIG. 6 is a diagram illustrating an example of a state in which computer graphics image data for guiding a method of repairing a failure is combined and displayed on a real-space image in the HMD.
- FIG. 7 is a diagram illustrating an example of a state in which computer graphics image data representing a failure repair completion notification is synthesized and displayed on a real-space video in the HMD.
- FIG. 8 is a diagram illustrating an example of rule data.
- FIG. 9 is a diagram illustrating an example of the proficiency level determination table.
- FIG. 10 is a diagram illustrating an example of the work content description table.
- FIG. 11 is a diagram illustrating an example of a worker table.
- FIG. 12 is a diagram illustrating an example of a work content history table.
- FIG. 13 is a diagram illustrating an example of the required capability table.
- FIG. 14 is a diagram illustrating an example of the configuration of the information processing system.
- FIG. 15 is a diagram illustrating a functional configuration of the location management server and the HMD.
- FIG. 16 is a flowchart illustrating processing of the information processing system.
- FIG. 17 is a diagram for explaining a signal transmitted from the RFID reader.
- FIG. 18 is a diagram illustrating an example in which the network cable laid on the free access floor is displayed as AR on the floor surface.
- FIG. 19 is a diagram for explaining an example of a configuration for specifying the position of the network cable laid on the free access floor.
- FIG. 20 is a diagram for explaining an example of a configuration for specifying the position of the
- FIG. 1 is a diagram schematically showing a configuration of a failure analysis system according to an embodiment of the present invention.
- The failure analysis system includes a failure analysis device 120 that acquires information related to the operating status of the various servers and routers 111 to 118 installed in a data center and determines a failure repair work method. The determination of the failure repair work method is executed, for example, when a user complains that the Internet cannot be used.
- When the failure analysis device 120 determines the failure repair work method, it transmits guidance information for the location of the failure repair work and guidance information for the failure repair work method to the glasses-type HMDs (Head Mounted Displays) 101 and 104 worn on the heads of the workers y and x.
- The HMDs 101 and 104 combine computer graphics image data that guides the workers y and x to the location of the failure repair work, and computer graphics image data that guides the method of the failure repair work, with the real-space video, and display the result.
- The HMDs 101 and 104 produce an augmented sense of reality by displaying computer graphics image data at positions that match the optical images optically transmitted and projected onto the lens portions of the HMDs 101 and 104.
- There are various types of HMD, such as non-transmissive, video see-through, and optical see-through.
- Composite display means superimposing computer graphics image data at the matching position on the real-space image shown on the lens portion of the HMD by any of these methods: in the optical see-through case, the data is superimposed on the real-space view seen through the lens, and in the video see-through case, it is superimposed on the real-space video captured by a video camera.
- an eyeglass-type HMD is cited as an example of an AR display device that is a device that performs AR display.
- An AR display device may instead be a mobile terminal that shows a real-space image captured by its camera on a liquid crystal display or the like and superimposes computer graphics image data on that image.
- It may also be a head-up display AR display device that is installed in the line-of-sight direction of the driver's seat of a vehicle and combines computer graphics image data with the optically transmitted and projected real-space view.
- When computer graphics image data is displayed in alignment with the real-space video, coordinate alignment is performed between the objects in the real-space video and the computer graphics image data.
- In coordinate alignment based on the position (latitude, longitude) and posture (direction, elevation angle) of the HMD, it is estimated where in the real space the worker wearing the HMD is looking, and the computer graphics image data is combined so as to match the estimated position.
- Alternatively, a marker may be attached to an object in the real space, the marker photographed with a camera, and the position of the marker detected from the photographed image data, with the computer graphics image data combined so as to match the detected position.
- The position of an object in the real space may also be detected by analyzing image data of the real space captured by the camera, with the computer graphics image data combined so as to match the detected position.
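The marker-based variant above can be illustrated with a small geometric sketch. Detecting the marker corners in the camera frame is assumed to happen elsewhere; this only shows how a detected quadrilateral could be turned into an anchor point and size for the CG overlay. All names are illustrative.

```python
# Illustrative marker-based alignment: given the four detected corner pixels of
# a marker attached to an object in the real space, compute where (and roughly
# how large) to draw the computer graphics overlay.

def overlay_anchor(corners):
    """corners: four (x, y) pixel coordinates of the marker quadrilateral.
    Returns (center_x, center_y, approx_side_length)."""
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    cx = sum(xs) / 4.0                     # centroid of the four corners
    cy = sum(ys) / 4.0
    side = ((max(xs) - min(xs)) + (max(ys) - min(ys))) / 2.0  # rough extent
    return cx, cy, side
```

A renderer would then scale the overlay to `side` and draw it centered at `(cx, cy)` so that it appears attached to the marked object.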
- small cameras 102 and 105 are attached to the HMDs 101 and 104 so that image data can be taken at an angle of view close to the field of view of the workers x and y.
- the captured image data is transmitted to the failure analysis apparatus 120 via a wireless communication line.
- The failure analysis device 120 temporarily stores the received image data in the storage device 122 and then analyzes it to detect the presence of the servers and routers 111 to 118 appearing in the image data and to determine their operating states.
- For this purpose, the failure analysis device 120 registers image data of the servers and routers 111 to 118 in advance and detects their presence by matching the captured image data against the registered image data.
- The failure analysis device 120 also analyzes the image data of the power lamp portions of the servers and routers 111 to 118: if the power lamp is detected as lit, the device is determined to be powered on; otherwise, it is determined to be powered off.
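The power-lamp check can be sketched as a simple brightness threshold over the lamp region of the captured frame. The patent only states that the lamp image is analyzed; the region extraction, grayscale representation, and threshold value here are assumptions.

```python
# Hedged sketch of the power-lamp analysis: average the brightness of the
# pixels in the lamp region and threshold it to classify the power state.

def lamp_is_lit(lamp_pixels, threshold=128):
    """lamp_pixels: iterable of 0-255 grayscale values from the lamp region."""
    pixels = list(lamp_pixels)
    if not pixels:
        return False  # lamp region not found in the frame
    return sum(pixels) / len(pixels) >= threshold

def power_state(lamp_pixels):
    """Classify a server/router as powered on or off from its lamp pixels."""
    return "on" if lamp_is_lit(lamp_pixels) else "off"
```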
- the failure analysis apparatus 120 acquires log data from the servers and routers 111 to 118 installed in the data center by the monitoring unit 121.
- Examples of the log data acquired here include CPU usage rate, memory usage rate, power supply voltage, temperature, communication speed, number of accesses, and the like.
- Log data acquired by the monitoring unit 121 is held in the storage device 122.
- The storage device 122 also stores past failure handling data, which indicates the responses actually performed for past failures, and rule data, which defines predetermined response rules for failures. For example, when a complaint that the Internet cannot be used is received, the instruction content generation unit 123 analyzes the latest log data and image data to recognize the operating status of the relevant servers and routers, acquires the relevant rule data and past failure handling data from the storage device 122, and determines the failure repair work method for the failure that has occurred.
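A minimal stand-in for the rule data and the decision logic is sketched below: each rule pairs a predicate over the latest log metrics with a repair instruction of the form (worker object, operation object, target object). The metric names, thresholds, and instructions are illustrative assumptions, not the patent's actual rule data.

```python
# Illustrative rule data: predicate over log metrics -> repair instruction.
RULES = [
    (lambda log: log["cpu_usage"] > 95, ("worker_x", "restart", "web_server_a")),
    (lambda log: log["memory_usage"] > 90, ("worker_y", "restart", "web_server_b")),
    (lambda log: log["access_count"] == 0, ("worker_y", "check_config", "proxy_server_d")),
]

def decide_repair(log):
    """Return the first matching (worker, operation, target) instruction,
    or None when no rule fires on the latest log data."""
    for predicate, instruction in RULES:
        if predicate(log):
            return instruction
    return None
```

In the actual system a decision tree learned from past failure handling data could play the role of this hand-written rule list.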
- the failure analysis apparatus 120 acquires map data including the current location of the HMDs 101 and 104 (the workers y and x wearing the HMDs 101 and 104) and the location of the failure repair work. Then, guidance information for guiding the route from the current location to the location of the fault repair work is generated. The generated guidance information is transmitted to the HMDs 101 and 104.
- The HMDs 101 and 104 then combine the computer graphics image data for guiding the route from the current location to the location of the failure repair work with the real-space video and display it as AR.
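The route guidance can be sketched on a simplified floor map: the data-center floor is modeled as a grid where 1 is a walkable aisle and 0 is a rack or wall, and a shortest path from the worker's current cell to the target device's cell is found with breadth-first search. The grid representation is an assumption; the patent only mentions map data.

```python
from collections import deque

def guide_route(grid, start, goal):
    """Return a shortest list of (row, col) cells from start to goal on a
    4-connected grid (1 = walkable, 0 = blocked), or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the path by walking back
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

The resulting cell sequence could then be rendered as arrows or a line overlaid on the HMD's real-space view.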
- The current locations of the HMDs 101 and 104 can be acquired using, for example, GPS (Global Positioning System).
- the instruction content generation unit 123 transmits the fault repair work method to the HMDs 101 and 104.
- the HMDs 101 and 104 synthesize computer graphics image data for guiding a method for repairing a failure with a video in the real space and display the AR. Thereby, the workers y and x can perform the fault repair work according to the guidance.
- Whether a worker has arrived at the target of the failure repair work can be detected by having the small cameras 102 and 105 photograph the servers and routers 111 to 118 and having the instruction content generation unit 123 analyze the captured image data.
- The failure analysis device 120 is also connected via the Internet to the PC 130 of the supervisor z of the workers y and x, who is at a location remote from the data center.
- the failure analysis apparatus 120 analyzes the content of the e-mail and determines the method of the failure repair work in consideration of the analyzed content.
- FIG. 2A is a diagram illustrating a hardware configuration of the failure analysis apparatus 120 according to the present embodiment.
- the CPU 201 comprehensively controls each device and controller connected to the system bus.
- The ROM 203 or HD (hard disk) 207 stores the BIOS (Basic Input/Output System), the operating system program, and the program for the processing shown in FIG. 3.
- In this embodiment, the HD 207 is disposed inside the failure analysis device 120, but a configuration corresponding to the HD 207 may instead be disposed outside it.
- The program for performing the processing shown in FIG. 3 according to the present embodiment may be recorded on a computer-readable recording medium such as a flexible disk (FD) or CD-ROM and supplied from such a medium, or may be supplied via a communication medium such as the Internet.
- the RAM 202 functions as a main memory, work area, and the like for the CPU 201.
- the CPU 201 implements various operations by loading a program necessary for execution of processing into the RAM 202 and executing the program.
- HD 207 and FD 206 function as an external memory.
- the disk controller 205 controls access to external memories such as the HD 207 and the FD 206.
- the communication I / F controller 204 is connected to the Internet or a LAN, and controls communication with the outside by, for example, TCP / IP.
- the display controller 208 controls image display on the display 209.
- the KB controller 210 receives an operation input from the KB (keyboard) 211 and transmits it to the CPU 201.
- a pointing device such as a mouse can also be applied to the failure analysis apparatus 120 according to the present embodiment as a user operation means.
- The instruction content generation unit 123 shown in FIG. 1 is realized by, for example, a program stored in the HD 207 being loaded into the RAM 202 as necessary and executed by the CPU 201.
- the monitoring unit 121 has a configuration corresponding to the communication I / F controller 204, and the storage device 122 has a configuration corresponding to the HD 207.
- FIG. 2B is a diagram illustrating a hardware configuration of the HMDs 101 and 104 and the small cameras 102 and 105 according to the present embodiment.
- the CPU 301 comprehensively controls each device and controller connected to the system bus.
- The ROM 302 stores, for example, the processing programs shown in FIGS. 4A to 4C that are executed by the HMDs 101 and 104. The programs shown in FIGS. 4A to 4C may also be supplied via a communication medium such as the Internet.
- the RAM 304 functions as a main memory, work memory, and the like for the CPU 301.
- the CPU 301 implements various operations by loading a program necessary for execution of processing into the RAM 304 and executing the program.
- the communication I / F controller 303 is connected to the Internet or a LAN, and controls communication with the outside by TCP / IP, for example.
- the imaging unit 305 converts the subject image incident through the optical lens and formed on the imaging element into an electrical signal, and outputs moving image data or still image data.
- the display controller 306 controls image display on the display 307.
- Since an optically transmissive head-mounted display is employed, the display 307 is configured with a half mirror, and the user wearing it can see the outside world through the display 307.
- the imaging unit 305 has a configuration corresponding to the small cameras 102 and 105 in FIG.
- FIG. 3 is a flowchart showing a processing flow of the failure analysis apparatus 120 according to the present embodiment.
- The process illustrated in FIG. 3 is started when, upon a complaint from a user (for example, that the Internet cannot be used), the administrator inputs the content of the complaint to the failure analysis device 120.
- In step S301, the instruction content generation unit 123 acquires the latest log data and image data, together with the necessary rule data and past failure handling data, from the storage device 122.
- Log data and image data stored in the storage device 122 are acquired by the monitoring unit 121.
- the monitoring unit 121 may acquire log data at a timing when the log data is updated in each server or router 111 to 118, or may acquire log data at a regular timing.
- the monitoring unit 121 acquires image data at the timing when the small cameras 102 and 105 capture image data.
- In step S302, the instruction content generation unit 123 determines a failure repair work method based on the log data, image data, rule data, and past failure handling data acquired from the storage device 122, using, for example, an existing decision tree algorithm.
- “who (worker object) performs what (operation object) with respect to which device (target object)” is determined as a method of failure repair work.
- Here, assume it is determined that worker x (worker object) performs a restart (operation object) on Web server a 117 (target object), and that worker y (worker object) performs a restart (operation object) on Web server b 112 (target object).
- In step S303, the instruction content generation unit 123 determines whether the failure repair work has been completed. If it has, the process proceeds to step S307; otherwise, the process proceeds to step S304. Here, since the failure repair work method was just determined in step S302, it is determined that the work has not been completed, and the process proceeds to step S304.
- In step S304, the instruction content generation unit 123 acquires map data including the current locations of the HMDs 101 and 104 and the positions of the target objects, generates guidance information for guiding a route from the current location of each HMD to its target object, and transmits it to the HMDs 101 and 104. That is, the instruction content generation unit 123 transmits to the HMD 101 guidance information for guiding the route from the current location of the worker y wearing the HMD 101 to Web server b 112, and transmits to the HMD 104 guidance information for guiding the route from the current location of the worker x wearing the HMD 104 to Web server a 117.
- In step S305, the instruction content generation unit 123 determines whether the workers y and x wearing the HMDs 101 and 104 have arrived at their target objects by analyzing the image data acquired from the storage device 122 in step S301. That is, the instruction content generation unit 123 analyzes the image data captured by the small cameras 102 and 105, and when the image data of Web server b 112 or Web server a 117 occupies more than a certain proportion of the captured frame, it determines that the worker has arrived at the target object. If the target object has been reached, the process proceeds to step S306; otherwise, the process returns to step S304.
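The arrival check in step S305 reduces to comparing the area of the matched target region against the frame area. A minimal sketch, assuming the matched area is already available from the image matching step and that the threshold ratio (here 0.3) is an illustrative value:

```python
# Sketch of the arrival judgment: the worker is judged to have arrived when
# the target device occupies at least `ratio` of the captured camera frame.

def has_arrived(matched_area, frame_width, frame_height, ratio=0.3):
    """matched_area: pixel area of the target matched in the frame."""
    return matched_area / float(frame_width * frame_height) >= ratio
```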
- In step S306, the instruction content generation unit 123 transmits the failure repair work method to the HMDs 101 and 104.
- For example, the instruction content generation unit 123 transmits to the HMD 101 a failure repair work method stating "worker y restarts Web server b 112".
- the instruction content generation unit 123 transmits to the HMD 104 a failure repair work method with the content “worker x restarts the Web server a 117”.
- computer graphics image data for guiding the method of the fault repair work “worker y restarts the Web server b 112” is displayed in combination with the real space on the HMD 101.
- the HMD 104 displays computer graphics image data that guides the method of fault repair work “worker x restarts the Web server a 117” in combination with the real space. Workers y and x perform fault repair work according to guidance by computer graphics image data displayed on the HMDs 101 and 104, respectively. If a change in the CPU usage rate, memory usage rate, communication speed, number of accesses, etc. of the servers and routers 111 to 118 occurs due to the fault repair work, the server and routers 111 to 118 update the log data.
- After step S306, the process returns to step S301.
- In step S301, the instruction content generation unit 123 again acquires the latest log data and image data, together with the necessary rule data and past failure handling data, from the storage device 122.
- In step S302, the instruction content generation unit 123 again determines a failure repair work method based on the log data, image data, rule data, and past failure handling data acquired from the storage device 122, using, for example, the existing decision tree algorithm.
- This time, assume it is determined that worker x (worker object) performs a restart (operation object) on proxy server c 115 (target object), and that worker y (worker object) confirms the setting file (operation object) of proxy server d 111 (target object).
- In step S303, the instruction content generation unit 123 determines whether the failure repair work has been completed; since it has not, the process proceeds to step S304.
- In step S304, the instruction content generation unit 123 generates, by the same method as described above, guidance information for guiding a route from Web server b 112, the current location of the HMD 101, to proxy server d 111, the target object, and transmits it to the HMD 101. Similarly, it generates guidance information for guiding a route from Web server a 117, the current location of the HMD 104, to proxy server c 115, the target object, and transmits it to the HMD 104.
- In step S305, the instruction content generation unit 123 determines, by the same method as described above, whether the workers y and x wearing the HMDs 101 and 104 have arrived at proxy server d 111 and proxy server c 115, their respective target objects.
- If they have arrived, the instruction content generation unit 123 transmits the content of the failure repair work to the HMDs 101 and 104 in step S306. That is, it transmits to the HMD 101 a failure repair work method stating "worker y checks the setting file of proxy server d111".
- the instruction content generation unit 123 transmits to the HMD 104 a failure repair work method with the content “worker x restarts the proxy server c115”.
- the computer graphics image data for guiding the method of the fault repair work “worker y confirms the setting file to the proxy server d111” is synthesized and displayed on the HMD 101.
- the HMD 104 also displays computer graphics image data that guides a failure repair work method “worker x restarts the proxy server c115”. Workers y and x perform fault repair work according to guidance by computer graphics image data displayed on the HMDs 101 and 104, respectively. The log data is updated by this failure repair work.
- Now suppose that the supervisor z of the workers y and x recalls that a hardware failure of router e 116 occurred the previous day and that it was handled, and sends an e-mail to the mail address of worker x stating, "There was a hardware failure of the router e116 on the previous day. A response has been made." This e-mail is acquired by the monitoring unit 121, and the acquired e-mail is stored in the storage device 122.
- After step S306, the process returns to step S301.
- In step S301, the instruction content generation unit 123 acquires the e-mail from the storage device 122 in addition to the latest log data and image data, the necessary rule data, and the past failure handling data.
- In step S302, the instruction content generation unit 123 determines the method of the fault repair work using, for example, an existing decision tree algorithm, based on the above-described e-mail in addition to the log data, image data, rule data, and past failure response data acquired from the storage device 122.
- As a result, the analysis of the e-mail is reflected, and it is determined that, as the fault repair work method, the worker x (worker object) checks the routing table information (operation object) on the router e116 (target object).
- In step S303, the instruction content generation unit 123 determines whether the fault repair work has been completed. Since it is determined that the fault repair work has not been completed, the process proceeds to step S304.
- In step S304, the instruction content generation unit 123 generates, by the same method as described above, guidance information for guiding a route from the proxy server c115, which is the current location of the HMD 104, to the router e116, which is the target object, and transmits it to the HMD 104.
- On the other hand, guidance information for the HMD 101 is not generated because there is no change in the target object of the fault repair work for the worker y wearing the HMD 101.
- In step S305, the instruction content generation unit 123 determines, by the same method as described above, whether the worker x wearing the HMD 104 has arrived at the router e116, which is the target object. If it is determined that the worker x wearing the HMD 104 has arrived at the router e116, the instruction content generation unit 123 transmits the content of the fault repair work to the HMD 104 in step S306. That is, the instruction content generation unit 123 transmits to the HMD 104 a fault repair work method with the content “worker x checks the routing table information on the router e116”. As a result, computer graphics image data that guides the fault repair work method “worker x checks the routing table information on the router e116” is synthesized and displayed on the HMD 104.
- When the worker x checks the routing table information according to the guidance of the fault repair work method, the worker x finds that the routing table information should have been changed as part of the hardware failure handling but has not been changed.
- Therefore, the worker x captures the display on which the routing table information is shown with the small camera 105 and transmits it to the failure analysis apparatus 120 as image data.
- The failure analysis apparatus 120 stores the received image data in the storage device 122.
- After step S306, the process returns to step S301.
- In step S301, the instruction content generation unit 123 again acquires the latest log data and image data, the necessary rule data, and the past failure handling data from the storage device 122.
- In step S302, the instruction content generation unit 123 determines the method of the fault repair work using, for example, the existing decision tree algorithm, based on the log data, image data, rule data, and past failure handling data acquired from the storage device 122.
- As a result, the analysis of the image data detects that the routing table information should have been changed but has not been, and it is determined that, as the fault repair work method, the worker x (worker object) changes the routing table information (operation object) on the router e116 (target object).
- In step S303, the instruction content generation unit 123 determines whether the fault repair work has been completed. Since it is determined that the fault repair work has not been completed, the process proceeds to step S304.
- In step S304, the instruction content generation unit 123 does not generate guidance information for the HMDs 101 and 104.
- In step S305, the instruction content generation unit 123 determines that both the workers y and x wearing the HMDs 101 and 104 have already arrived at their target objects, so the process proceeds to step S306.
- In step S306, the instruction content generation unit 123 transmits the content of the fault repair work to the HMD 104. That is, the instruction content generation unit 123 transmits to the HMD 104 a fault repair work method with the content “worker x changes the routing table information on the router e116”. As a result, computer graphics image data that guides the fault repair work method “worker x changes the routing table information on the router e116” is synthesized and displayed on the HMD 104.
- By this fault repair work, the log data of the servers and routers 111 to 118 is updated, and the lighting state of the server and router lamps changes.
- Accordingly, the captured image data is also updated.
- The log data and image data updated in this way are stored in the storage device 122 of the failure analysis apparatus 120.
- After step S306, the process returns to step S301.
- In step S301, the instruction content generation unit 123 again acquires the latest log data and image data, the necessary rule data, and the past failure handling data from the storage device 122.
- In step S302, the instruction content generation unit 123 determines the method of the fault repair work using, for example, the existing decision tree algorithm, based on the log data, image data, rule data, and past failure handling data acquired from the storage device 122.
- This time, from the result of analyzing the log data (for example, detecting that the CPU usage rate has decreased or that the communication speed has improved) and the result of analyzing the image data (for example, detecting that the failure lamp has turned off), the instruction content generation unit 123 determines that there is no longer a failure, and does not determine a fault repair work method. Accordingly, in step S303, it is determined that the fault repair has been completed, and the process proceeds to step S307.
- In step S307, the instruction content generation unit 123 transmits a failure repair completion notification to the HMDs 101 and 104.
- As a result, computer graphics image data with content such as “completely restored at XX hour” is displayed on the HMDs 101 and 104, and the workers y and x finish the fault repair work.
- Note that the processing shown in FIG. 3 may be started not only by an administrator's input to the failure analysis apparatus but also when the performance information of various devices in the data center exceeds a predetermined threshold.
- FIGS. 4A to 4C are flowcharts showing processing of the HMDs 101 and 104.
- FIG. 4A shows a process in which the HMDs 101 and 104 display computer graphics image data that guides the route from the current location to the target object of the fault repair work.
- FIG. 4B shows a process in which the HMDs 101 and 104 display computer graphics image data that guides the method of fault repair work.
- FIG. 4C illustrates a process in which the HMD 101 or 104 displays a failure repair completion notification.
- Each process shown in FIGS. 4A to 4C is performed by the CPU 301 reading a necessary program from the ROM 302 and executing it.
- In step S401, the HMDs 101 and 104 determine whether guidance information for guiding a route from the current location to the target object of the fault repair work has been received from the failure analysis apparatus 120.
- If the guidance information has been received, the process proceeds to step S402. If the guidance information has not been received, the process returns to step S401 to determine again whether the guidance information has been received.
- In step S402, the HMDs 101 and 104 superimpose computer graphics image data that guides the route from the current location to the target object of the fault repair work on the video of the real space, and perform AR display. As a result, the route from the current location to the target object of the fault repair work is guided to the workers y and x.
- FIG. 5 is a diagram illustrating an example of a state in which the computer graphics image data for guiding the route from the current location to the proxy server c115 that is the target object of the fault repair work is combined and displayed on the real space video in the HMD 104.
- In FIG. 5, the proxy server d111, the Web server b112, the proxy server c115, and the router e116 are images of the real space.
- Reference numeral 501 denotes computer graphics image data that guides the route from the current location to the proxy server c115 that is the target object of the fault repair work, and is synthesized and displayed in a state where the coordinates are matched with the video in the real space.
- Reference numeral 502 denotes computer graphics image data that clearly indicates that the target object serving as the route guidance destination is the proxy server c115, and it is synthesized and displayed at a predetermined position in the video displayed on the HMD 104.
- In step S403, the HMDs 101 and 104 determine whether a fault repair work method has been received from the failure analysis apparatus 120. If the fault repair work method has been received, the process proceeds to step S404. If not, the process returns to step S403 to determine again whether the fault repair work method has been received.
- In step S404, the HMDs 101 and 104 synthesize and display computer graphics image data that guides the method of the fault repair work on the video of the real space. As a result, the method of the fault repair work is guided to the workers y and x.
- FIG. 6 is a diagram showing an example of a state in which the computer graphics image data for guiding the method of repairing the failure is synthesized and displayed on the real space video in the HMD 104.
- In FIG. 6, the proxy server c115 is an image of the real space. Since the operation object of the fault repair work is a restart, the computer graphics image data 601 indicating the restart button is synthesized and displayed at the position of the restart button.
- Reference numeral 602 denotes computer graphics image data that clearly indicates that the target object of the fault repair work is the proxy server c115 and that the operation object is a restart, and it is synthesized and displayed at a predetermined position in the video displayed on the HMD 104.
- In step S405, the HMDs 101 and 104 determine whether a failure repair completion notification has been received from the failure analysis apparatus 120. If the failure repair completion notification has been received, the process proceeds to step S406. If not, the process returns to step S405 to determine again whether the failure repair completion notification has been received.
- In step S406, the HMDs 101 and 104 synthesize and display computer graphics image data representing the failure repair completion notification on the video of the real space. As a result, the workers y and x are notified that the fault repair work has been completed.
- FIG. 7 is a diagram illustrating an example of a state in which the computer graphics image data representing the failure repair completion notification is synthesized and displayed on the real space video in the HMD 104.
- In FIG. 7, the router e116 is an image of the real space.
- Reference numeral 701 denotes computer graphics image data for notifying that the failure repair work has been completed, and is synthesized and displayed at a predetermined position in the video displayed on the HMD 104.
- As described above, in the present embodiment, when a failure occurs, information related to the fault repair work corresponding to the failure that has occurred is presented to the worker.
- The worker proceeds with the work according to the presented information on the fault repair work, such as the fault repair work method and the place where the fault occurred, and the operating state of the system changes as the work progresses.
- When the operating state of the system changes, information on the fault repair work, including the fault repair work method corresponding to the operating state after the change, is newly obtained and presented to the worker.
- This is possible because the failure analysis apparatus acquires the log data and image data each time and determines the fault repair work method according to the operating status of the system at that time.
- An example in which the instruction content generation unit 123 determines the method of the fault repair work using an existing decision tree algorithm will be described below with reference to FIG. 8.
- FIG. 8 is a diagram showing an example of rule data.
- the rule data shown in FIG. 8 indicates the weighting of the priority of failure repair work for each check item.
- the weighting is represented by numerical values from “0” to “10”, and the higher the numerical value, the higher the priority.
- The instruction content generation unit 123 determines whether the information indicating the operating status of the computer system (various log data, the CPU usage rate, the memory usage rate, the communication speed, command execution results, and the like) corresponds to each check item of the rule data shown in FIG. 8. Then, the instruction content generation unit 123 calculates, for each fault repair work, the total value of the weights of the check items determined to correspond to the current operating status of the computer system, and instructs the workers to perform the fault repair work with the largest calculated total value.
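- The weighted selection described above — match check items against the current operating status, sum the weights per repair work, and pick the work with the largest total — can be sketched as follows. The check items, weights, and observed status values are hypothetical examples, not the actual rule data of FIG. 8.

```python
# Hypothetical rule data: for each fault repair work, the weight of the
# priority assigned to each matching check item (cf. FIG. 8).
RULE_DATA = {
    "restart proxy server": {"cpu_usage_over_90": 8, "response_timeout": 6},
    "check routing table":  {"route_unreachable": 9, "response_timeout": 3},
}

def select_repair_work(observed_items):
    """Sum the weights of the check items that match the current system
    status and return the fault repair work with the largest total."""
    totals = {
        work: sum(w for item, w in checks.items() if item in observed_items)
        for work, checks in RULE_DATA.items()
    }
    return max(totals, key=totals.get)

# Both works share "response_timeout", but "route_unreachable" tips the
# total (9 + 3 = 12) toward checking the routing table.
print(select_repair_work({"route_unreachable", "response_timeout"}))
```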
- In addition, the proficiency level of the worker may be determined based on the execution result of the work by the worker, and the instruction content may be changed according to the determined proficiency level.
- More specifically, after transmitting the information instructing the fault repair work to the worker, the instruction content generation unit 123 measures, for example, the time from when the worker starts the work until the work is completed, and determines the proficiency level of the worker based on the measured time.
- a proficiency level determination table in which a correspondence relationship between work time and proficiency level is defined in advance is stored in the storage device 122.
- FIG. 9 is a diagram illustrating an example of the proficiency level determination table.
- Alternatively, after transmitting the information instructing the fault repair work to the worker, the instruction content generation unit 123 may determine whether the work performed by the worker was done correctly, and determine the proficiency level of the worker based on the rate at which the work was performed correctly (the work success rate).
- Also in this case, a proficiency level determination table in which the correspondence between the work success rate and the proficiency level is defined in advance is stored in the storage device 122. In the example of the proficiency level determination table 2 shown in FIG. 9, for the work 1, a success rate of 90% or more results in an A determination, a success rate of 70% or more and less than 90% results in a B determination, and a success rate of less than 70% results in a C determination.
- The rate at which work is correctly performed may be, for example, the ratio of the number of correctly completed items to the total number of work items, or may be calculated by a predefined evaluation formula after weighting each item by the difficulty level of the work; it is not particularly limited.
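- The two table lookups described above can be sketched as follows; the time thresholds and success-rate thresholds are hypothetical stand-ins for the proficiency level determination tables of FIG. 9.

```python
# Hypothetical proficiency level determination tables (cf. FIG. 9):
# one keyed by work time, the other by work success rate.
TIME_TABLE = [(3 * 60, "A"), (10 * 60, "B")]   # (max seconds, level); else "C"
SUCCESS_TABLE = [(0.90, "A"), (0.70, "B")]      # (min success rate, level); else "C"

def level_by_time(seconds):
    """Faster completion earns a higher proficiency level."""
    for limit, level in TIME_TABLE:
        if seconds <= limit:
            return level
    return "C"

def level_by_success_rate(rate):
    """A higher work success rate earns a higher proficiency level."""
    for threshold, level in SUCCESS_TABLE:
        if rate >= threshold:
            return level
    return "C"

# A worker finishing in 150 s rates A by time; an 85% success rate rates B.
print(level_by_time(150), level_by_success_rate(0.85))
```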
- Next, the instruction content generation unit 123 determines the information on the fault repair work to be transmitted to the worker next time according to the determination result of the worker's proficiency level. For example, even when a new worker and a veteran worker perform the same work, the new worker may not understand technical terms, so it is better to explain the basic matters in detail. Conversely, explaining the basic matters in detail to the veteran worker reduces work efficiency, so it is preferable to use technical terms.
- Therefore, the instruction content generation unit 123 transmits to the new worker information representing a fault repair work method in which the work content is explained in detail, using expressions that avoid technical terms.
- The HMD worn by the new worker performs AR display by synthesizing the computer graphics image data that guides the fault repair work method explained in detail with the video of the real space.
- On the other hand, the instruction content generation unit 123 transmits to the veteran worker information representing the fault repair work method in which the work content is explained briefly, using technical terms.
- The HMD worn by the veteran worker performs AR display by synthesizing the computer graphics image data that guides the fault repair work method, which briefly explains the work content using technical terms, with the video of the real space.
- For this purpose, a work content explanation table needs to be stored in the storage device 122 in advance.
- When the instruction content generation unit 123 determines the proficiency level of a worker, it reads the explanation of the work content corresponding to the determined proficiency level from the work content explanation table and transmits the read explanation to that worker.
- FIG. 10 is a diagram showing an example of the work content explanation table.
- For example, the instruction content generation unit 123 transmits the content of the explanation X to a worker determined to have the proficiency level A for the work 1.
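- The lookup from the work content explanation table can be sketched as follows; the explanation texts and the (work, proficiency level) keys are illustrative placeholders, with only “explanation X for proficiency level A of work 1” taken from the description above.

```python
# Hypothetical work content explanation table (cf. FIG. 10): the explanation
# becomes more detailed and less technical as the proficiency level drops.
EXPLANATION_TABLE = {
    ("work 1", "A"): "Explanation X: restart proxy server c115.",
    ("work 1", "B"): "Explanation Y: press the restart button on the front panel of proxy server c115.",
    ("work 1", "C"): "Explanation Z: detailed step-by-step instructions avoiding technical terms.",
}

def explanation_for(work, proficiency):
    """Read the explanation matching the worker's determined proficiency."""
    return EXPLANATION_TABLE[(work, proficiency)]

# A worker rated A for work 1 receives the terse, technical explanation X.
print(explanation_for("work 1", "A"))
```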
- When the fault repair work is performed repeatedly, each worker is sent an explanation of the work at a level determined according to that worker's work results up to the previous time.
- As a result, work instructions can be given appropriately in accordance with each worker's proficiency level, so the fault repair work can be performed efficiently.
- As a configuration that takes security into consideration, the determination may be based on the job title data and access authority of the worker registered in advance in the worker table or the like.
- Further, the worker who performs the fault repair work may be determined according to the proficiency level of each worker.
- In this case, the instruction content generation unit 123 determines the proficiency level of each worker for the content of the fault repair work performed by that worker, using the proficiency level determination table in the same manner as in the configuration described above, and writes each determined proficiency level to the worker table.
- FIG. 11 is a diagram illustrating an example of a worker table. In the example illustrated in FIG. 11, for the worker X, the proficiency level of the work 1 is determined as A rank, and the proficiency level of the work 2 is determined as B rank.
- Note that the proficiency level may be determined using the proficiency level determination table based on the work content of the previous fault repair work.
- Alternatively, the work content history of past fault repair work (the work time and the degree to which the work was performed correctly) may be stored in a work content history table, and the proficiency level may be determined using the proficiency level determination table based on the average of each value in the work content history. Further, a worker who has no work content history may be determined to have the lowest proficiency level; the method is not particularly limited.
- FIG. 12 is a diagram illustrating an example of a work content history table. In the example shown in FIG. 12, for example, the worker with worker ID 001 (that is, worker X) has performed work 1 on March 1, 20XX with a work time of 3 minutes and a work success rate of 85%.
- FIG. 13 is a diagram illustrating an example of the required capability table.
- In the example shown in FIG. 13, the minimum capability level required of a worker who performs the work 1 is the C rank.
- When the fault repair work method is determined, the instruction content generation unit 123 reads, from the required capability table, the proficiency level required to take charge of the determined fault repair work. Then, the instruction content generation unit 123 reads the capability level data of each worker stored in the worker table, determines a worker who has the capability level necessary to take charge of the fault repair work, and transmits the information indicating the fault repair work method to the determined worker.
- The capability level data stored in the worker table may be determined, for example, based on the average proficiency level of each work, or by another evaluation formula; it may be any value that represents the worker's overall capability level.
- Note that this capability level may be updated by the instruction content generation unit 123 every time the worker performs a fault repair work.
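- The selection of a person in charge from the worker table and the required capability table can be sketched as follows; the rank ordering and table contents are hypothetical examples in the spirit of FIGS. 11 and 13.

```python
# Hypothetical rank ordering and tables: the worker table holds each
# worker's overall capability level, and the required capability table
# holds the minimum level needed for each fault repair work.
RANK_ORDER = {"A": 3, "B": 2, "C": 1}

WORKER_TABLE = {"worker X": "A", "worker Y": "C"}
REQUIRED_CAPABILITY = {"work 1": "C", "work 2": "B"}

def eligible_workers(work):
    """Return the workers whose capability level meets the minimum
    required to take charge of the given fault repair work."""
    need = RANK_ORDER[REQUIRED_CAPABILITY[work]]
    return [w for w, lv in WORKER_TABLE.items() if RANK_ORDER[lv] >= need]

# Work 2 requires rank B or higher, so only worker X qualifies.
print(eligible_workers("work 2"))
```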
- When the fault repair work is performed repeatedly, the worker in charge is determined according to each worker's work results up to the previous time. This makes it possible to assign an appropriate worker as the person in charge according to the difficulty level of the repair work.
- Incidentally, in fault repair work, it may be necessary to identify where the network cable to be checked is laid, for example in a server rack under the floor panels (a floor of this type is hereinafter referred to as a “free access floor”) or inside the wall surfaces of the facility.
- the present invention also provides a technique that makes it possible to easily find a desired network cable from the network cables laid on the free access floor using the AR technique.
- Hereinafter, this technique will be described in detail.
- Specifically, the failure analysis system superimposes and displays a virtual image representing the network cable on the floor panel in the real space, or on a captured image of the floor panel, along the actual network cable laid on the free access floor.
- FIG. 14 is a diagram illustrating an example of the configuration of the failure analysis system according to the embodiment of the present invention.
- As shown in FIG. 14, the failure analysis system according to the embodiment of the present invention comprises a network cable 801 in which RFIDs 800a, 800b, 800c, ... are embedded, RFID readers 802a to 802d, a location management server 803, and an HMD 804.
- Note that FIG. 14 shows only the part related to the technique for displaying the network cable in AR; the configuration shown in FIG. 14 is incorporated as part of the failure analysis system shown in FIG. 1. That is, the various configurations for supporting the fault repair work are the same as those of the failure analysis system shown in FIG. 1.
- The network cable 801 is laid on the free access floor in the data center and is hidden by the floor panels, so it is not normally visible. RFIDs 800 are embedded in the network cable 801 at constant intervals (for example, 1 m intervals), and each RFID stores an ID for identifying the network cable.
- The RFID readers 802a to 802d receive radio waves from the RFIDs 800 embedded in the network cable 801.
- the location management server 803 is connected to the RFID readers 802a to 802d through a network so as to be able to communicate with the RFID readers 802a to 802d, and detects the location of each RFID 800 based on information received from the RFID readers 802a to 802d. Then, the location management server 803 transmits location information indicating the location of the detected RFID 800 to the HMD 804. Details of the location management server 803 will be described later.
- In the present embodiment, the location management server 803 and the RFID readers 802a to 802d are connected by a wireless network, but they may instead be connected by a wired LAN.
- The hardware configuration of the location management server 803 is similar to that of the failure analysis apparatus 120 described with reference to FIG. 2A: the CPU controls each device and controller connected to the system bus, and the memory and HD store, in addition to the BIOS and OS, programs for realizing various functions.
- The HMD 804 superimposes and displays a virtual image 805 representing the network cable 801 on the floor surface along the actual position of the network cable 801, based on the position information of the RFIDs 800 received from the location management server 803. Note that the HMD 804 has the same functions as the HMD 101 described with reference to FIG. 1, and a detailed description thereof is omitted.
- The hardware configuration of the HMD 804 is similar to that of the HMD 101 described with reference to FIG. 2B: the CPU controls each device and controller connected to the system bus, and the memory and HD store, in addition to the BIOS and OS, programs for realizing various functions.
- FIG. 15 is a diagram illustrating a functional configuration of the location management server 803 and the HMD 804.
- the location management server 803 includes a signal receiving unit 8031, a location specifying unit 8032, and a location information transmitting unit 8033.
- the signal receiving unit 8031 receives signals transmitted from the RFID readers 802a to 802d.
- The position identifying unit 8032 identifies each position of the RFIDs 800 (RFID 800a, RFID 800b, RFID 800c, ...) provided in the network cable 801, based on the signals that the signal receiving unit 8031 receives from each of the RFID readers 802a to 802d.
- the position information transmission unit 8033 transmits position information indicating each position of the RFID 800 to the HMD 804.
- the HMD 804 includes a position information receiving unit 8041, a position and orientation detection unit 8042, and an image presentation unit 8043.
- the position information receiving unit 8041 receives the position information described above from the position management server 803.
- the position / orientation detection unit 8042 detects the position (for example, latitude, longitude, or locally defined position coordinates) and the attitude (for example, direction, elevation angle) of the HMD 804. Based on the position and orientation of the HMD 804 detected by the position and orientation detection unit 8042, the image presentation unit 8043 estimates where the worker wearing the HMD 804 is looking in the real space.
- When the image presentation unit 8043 detects, based on the position information of the RFIDs 800 received by the position information receiving unit 8041, that the worker is looking at the floor area of the free access floor under which the network cable 801 is laid, it generates a virtual image 805 representing the network cable 801 along the actual position of the laid network cable 801 and superimposes it on the image of the floor surface (or, in the case of a transmissive HMD, on the real space).
- FIG. 16 is a flowchart showing processing of the failure analysis system according to the embodiment of the present invention.
- In step S501, the signal receiving unit 8031 receives high-level signals from all of the RFID readers 802a to 802d.
- FIG. 17 is a diagram for explaining signals transmitted from the RFID readers 802a to 802d.
- In FIG. 17, a plurality of network cables are laid on the free access floor. Here, it is assumed that the operator has performed, via an input device such as a portable terminal connected to the failure analysis system, an input instructing that the position of the network cable 801 be displayed in AR.
- The network cable 801 includes a plurality of RFIDs, such as the RFID 800a, the RFID 800b, the RFID 800c, and so on; the RFID readers 802a to 802d receive the radio signals from each of the RFIDs 800 of the network cable 801, and at the timing of each reception transmit a high-level waveform signal to the position management server 803.
- Each RFID 800 provided in the network cable 801 transmits a radio signal. If the RFID 800 is a passive type, for example, it transmits the radio signal in response to the radio waves emitted from the RFID readers 802a to 802d when the input instructing that the position of the network cable 801 be displayed in AR is performed.
- Upon receiving a signal transmitted from an RFID 800, the RFID readers 802a to 802d output a high-level signal at that timing.
- In the example of FIG. 17, the RFID reader 802a receives a signal from the RFID 800 at time t1 and outputs a high-level signal at the same time t1.
- The RFID reader 802b receives a signal from the RFID 800 at time t2 and outputs a high-level signal at the same time t2.
- The RFID reader 802c receives a signal from the RFID 800 at time t3 and outputs a high-level signal at the same time t3.
- The RFID reader 802d receives a signal from the RFID 800 at time t4 and outputs a high-level signal at the same time t4.
- the RFID readers 802a to 802d receive signals from the RFID 800a, and the signal receiving unit 8031 of the location management server 803 receives high level signals from all the RFID readers 802a to 802d.
- the RFID readers 802a to 802d receive signals from the RFID 800b, and the signal receiving unit 8031 of the location management server 803 receives high level signals from all the RFID readers 802a to 802d.
- In this way, signals from all the RFIDs 800 provided in the network cable 801 are received, and the signal receiving unit 8031 of the location management server 803 receives high-level signals from all the RFID readers 802a to 802d for all the RFIDs 800.
- In step S502, the position specifying unit 8032 specifies the position of each RFID 800 (that is, the position of each of the RFID 800a, the RFID 800b, the RFID 800c, ...) based on the signals received by the signal receiving unit 8031.
- As the position specifying method, for example, the TDOA (Time Difference of Arrival) method, which uses the difference in signal arrival times, can be used. For example, the technique disclosed in Japanese Patent Laid-Open No. 2000-98019 can be applied.
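- As a rough illustration of the TDOA idea — each reader timestamps the same transmission, and the tag position is the point whose predicted arrival-time differences best match the measured ones — the following brute-force sketch scans a grid of candidate positions. The reader coordinates, floor size, and grid step are hypothetical; a practical system would use a closed-form or least-squares TDOA solver such as the technique cited above.

```python
import math

# Hypothetical reader coordinates (meters) at the four corners of the floor.
READERS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
C = 3.0e8  # propagation speed of the radio signal (m/s)

def tdoa_estimate(arrival_times, size=10.0, step=0.05):
    """Estimate a tag position from the arrival times at each reader by
    scanning candidate positions and minimizing the mismatch between the
    measured and predicted arrival-time differences (relative to reader 0)."""
    measured = [t - arrival_times[0] for t in arrival_times]
    best, best_err = None, float("inf")
    n = int(size / step) + 1
    for i in range(n):
        for j in range(n):
            x, y = i * step, j * step
            d0 = math.hypot(x - READERS[0][0], y - READERS[0][1])
            err = 0.0
            for k in range(1, len(READERS)):
                dk = math.hypot(x - READERS[k][0], y - READERS[k][1])
                err += ((dk - d0) / C - measured[k]) ** 2
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Simulate an RFID at (3.0, 4.0): the arrival time at each reader is
# simply distance / C.
tag = (3.0, 4.0)
times = [math.hypot(tag[0] - rx, tag[1] - ry) / C for rx, ry in READERS]
estimate = tdoa_estimate(times)
print(estimate)
```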
- step S503 the position information transmission unit 8033 transmits position information indicating the position specified in step S502 to the HMD 804.
- Note that the location management server 803 may store information related to the network cable 801 (the connection source, connection destination, line speed, and the like) registered in advance by an operator, and may transmit the information about the network cable to the HMD 804 together with the position information.
- step S504 the location information receiving unit 8041 of the HMD 804 receives the location information and information on the network cable described above from the location management server 803.
- step S505 the position / orientation detection unit 8042 detects the position and orientation of the HMD 804.
- In step S506, the image presentation unit 8043 estimates where in the real space the worker wearing the HMD 804 is looking, based on the position and orientation of the HMD 804 detected by the position and orientation detection unit 8042.
- When the image presentation unit 8043 detects, based on the position information of the RFIDs 800 received by the position information receiving unit 8041, that the worker is looking at the floor area of the free access floor under which the network cable 801 is laid, it generates a virtual image 805 representing the network cable 801 along the actual position of the laid network cable 801 and, as shown in FIG. 18, superimposes it on the floor image (or, in the case of a transmissive HMD, on the real space).
- FIG. 18 is a diagram illustrating an example in which the network cable laid on the free access floor is displayed as AR on the floor surface.
- In this example, the image presentation unit 8043 presents the information related to the network cable 801 (the connection source, connection destination, line speed, and the like) in addition to the virtual image representing the network cable 801.
- the location of the HMD 804 is specified similarly to the location of the RFID 800.
- the HMD 804 is also equipped with an RFID, and the RFID readers 802a to 802d receive signals from the RFID of the HMD 804. It can be specified based on the phase difference.
- the relative positional relationship between the HMD 804 and the RFID 800 can be specified, and the virtual image of the network cable 801 can be displayed as an AR at an appropriate position on the HMD 804.
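As a rough illustration (not part of the patent text): once the HMD and the RFID 800 are localized in the same coordinate system by the readers 802a to 802d, placing the virtual cable image at the right position amounts to transforming the cable's floor coordinates into the HMD's frame. A minimal 2D sketch in Python, with all names and conventions hypothetical:

```python
# Sketch: express a world-frame floor point relative to the HMD by
# translating by the HMD's position and rotating by its heading.
# 2D only, for brevity; all names here are illustrative.
import math

def to_hmd_frame(point, hmd_pos, hmd_heading_rad):
    """Transform a world-frame floor point into the HMD's local frame."""
    dx, dy = point[0] - hmd_pos[0], point[1] - hmd_pos[1]
    c, s = math.cos(-hmd_heading_rad), math.sin(-hmd_heading_rad)
    return (c * dx - s * dy, s * dx + c * dy)

# An RFID one metre ahead of an HMD facing the world +y direction
# (heading pi/2). In this convention the HMD's forward axis maps to
# local +x, so rel is approximately (1.0, 0.0).
rel = to_hmd_frame((2.0, 3.0), hmd_pos=(2.0, 2.0),
                   hmd_heading_rad=math.pi / 2)
```

The same transform applied to every point of the cable polyline gives the positions at which the virtual image should be drawn on the display.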
- FIG. 19 is a diagram for explaining an example of a configuration for specifying the position of the network cable laid on the free access floor.
- FIG. 19 is a plan view of the server room.
- In FIG. 19, a plurality of racks, including a rack A and a rack B, are installed on a floor panel Z.
- The floor panel Z forms a free access floor.
- Under the floor panel Z, a network cable is laid, but it is hidden by the floor panel Z and cannot normally be seen.
- Coordinates are set on the floor panel Z, and the wavy lines drawn on the floor panel Z indicate those coordinates.
- FIG. 19 shows an example in which the port a of a device installed in the rack A and the port b of a device installed in the rack B are connected by a network cable Z.
- The network cable Z is laid under the floor panel Z.
- FIG. 19 shows a line representing the position of the network cable Z on the floor panel Z.
- The network cable Z passes through the coordinates (x1, y1), (x2, y2), (x3, y3) and (x4, y4) in order from the port a side, and is connected to the port b.
- These coordinates and the network ports are registered in the position management server Z in advance by the user.
- The position management server Z transmits the above coordinates and the network port information to the HMD worn by the worker as the network cable position information. Furthermore, when the HMD worn by the worker determines that the worker is viewing, through the HMD, the floor panel Z under which the network cable Z to be checked is laid, the HMD generates a virtual image representing a line connecting the above coordinates, and presents the generated virtual image to the worker superimposed on the video of the floor panel Z.
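The HMD-side rendering described above can be read, loosely, as connecting the registered coordinates into a polyline and drawing it over the floor video. A minimal sketch in Python, with all names hypothetical and not taken from the patent:

```python
# Sketch: build the virtual cable image as an ordered polyline through
# the floor coordinates received from the position management server.

def cable_polyline(port_a, waypoints, port_b):
    """Return the ordered 2D floor coordinates the virtual cable
    image passes through, from port a to port b."""
    return [port_a] + list(waypoints) + [port_b]

# Coordinates as in FIG. 19: the cable runs from port a through
# (x1, y1) .. (x4, y4) to port b (values here are made up).
path = cable_polyline(
    port_a=(0.0, 0.5),
    waypoints=[(1.0, 0.5), (1.0, 2.0), (3.0, 2.0), (3.0, 3.5)],
    port_b=(4.0, 3.5),
)

# Line segments to draw as the superimposed virtual image.
segments = list(zip(path, path[1:]))
print(len(segments))  # 5 segments for 6 points
```

Each segment would then be projected into the HMD's view before being overlaid on the floor panel video.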
- FIG. 20 is a diagram for explaining another example of the configuration for specifying the position of the network cable laid on the free access floor.
- FIG. 20 is a plan view of the server room.
- In FIG. 20, a plurality of racks, including a rack A and a rack B, are installed on a floor panel Z′.
- The floor panel Z′ forms a free access floor under which a network cable is laid, but the cable is hidden by the floor panel Z′ and cannot normally be seen.
- Coordinates are set on the floor panel Z′, and the wavy lines drawn on the floor panel Z′ indicate those coordinates.
- FIG. 20 shows an example in which the port a of a device installed in the rack A and the port b of a device installed in the rack B are connected by a network cable Z′.
- The network cable Z′ is laid under the floor panel Z′, and FIG. 20 shows a line representing the position of the network cable Z′ on the floor panel Z′.
- RFID tags are embedded in the network cable Z′ at predetermined intervals. Each RFID embedded in the network cable Z′ stores an ID that distinguishes the network cable Z′ from other network cables.
- The floor panel Z′ is provided with an RFID reader at each point where the wavy lines intersect. These RFID readers can communicate with the position management server Z′.
- Each RFID reader provided at an intersection of the wavy lines of the floor panel Z′ reads the signal transmitted by an RFID embedded in the network cable and transmits it to the position management server Z′. At this time, in addition to the signal representing the ID of the network cable Z′ emitted by the RFID, each RFID reader also transmits the strength of the signal it read to the position management server Z′.
- Based on the signals and the signal strengths received from the RFID readers, the position management server Z′ identifies the coordinates of the RFID readers that read a signal of a certain strength or higher from the RFIDs embedded in the network cable Z′. That is, when an RFID reader has received a signal of a certain strength or higher, the network cable Z′ passes near that reader, so the network cable Z′ is taken to pass through the coordinates identified by the position management server Z′. In the example shown in FIG. 20, the RFID readers provided at the coordinates indicated by the white circles have received a signal of a certain strength or higher. It is also assumed that the ports to which the network cable Z′ is connected are registered in advance in the position management server Z′ by the user.
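The server-side logic just described, recovering the cable path from reader reports, is essentially a threshold filter over (coordinate, cable ID, strength) readings. An illustrative sketch in Python; the threshold value, the strength scale, and every name are assumptions, not from the patent:

```python
# Sketch: keep the coordinates of readers that received the target
# cable's ID at or above a certain strength; those coordinates are
# taken to lie on the cable's path.

THRESHOLD = 0.6  # "a certain strength or higher" (assumed 0..1 scale)

def cable_coordinates(readings, cable_id, threshold=THRESHOLD):
    """readings: iterable of (reader_coordinate, cable_id, strength)."""
    return [coord for coord, cid, strength in readings
            if cid == cable_id and strength >= threshold]

readings = [
    ((0, 1), "Z'", 0.9),
    ((1, 1), "Z'", 0.7),
    ((2, 1), "Z'", 0.2),     # too weak: cable does not pass nearby
    ((1, 2), "other", 0.8),  # a different cable's ID
]
print(cable_coordinates(readings, "Z'"))  # [(0, 1), (1, 1)]
```

The surviving coordinates correspond to the white circles in FIG. 20 and, together with the registered port information, are what the server sends to the HMD.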
- The position management server Z′ transmits the coordinates indicated by the white circles and the port information to the HMD worn by the worker as the network cable position information described above. Further, when the HMD worn by the worker determines that the worker is viewing, through the HMD, the floor panel Z′ under which the network cable Z′ to be checked is laid, the HMD generates a virtual image representing a line connecting those coordinates, and presents the generated virtual image to the worker superimposed on the video of the floor panel Z′.
- In this way, the worker is presented, via the HMD, with a virtual image representing the network cable on the floor surface, along the position of the network cable laid under the free access floor.
- Therefore, in the failure analysis system, when a worker receives a work instruction to check a network cable as part of fault repair work, the worker can easily find where under the free access floor the target network cable is laid, which improves the worker's efficiency.
- The embodiments of the present invention can be realized by a computer executing a program.
- Means for supplying the program to a computer, for example a computer-readable recording medium such as a CD-ROM on which the program is recorded, or a transmission medium for transmitting the program, can also be applied as an embodiment of the present invention.
- A program product, such as a computer-readable recording medium on which the program is recorded, can also be applied as an embodiment of the present invention.
- The above programs, computer-readable recording media, transmission media, and program products are included in the scope of the present invention.
- The present invention is useful for augmented reality, in which computer graphics image data can be displayed superimposed on real-space video.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Quality & Reliability (AREA)
- Optics & Photonics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Human Computer Interaction (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Debugging And Monitoring (AREA)
Abstract
Description
The information processing method of the present invention is an information processing method executed by an augmented reality presentation apparatus capable of combining and displaying real-space video and computer graphics image data, and a failure analysis apparatus capable of analyzing a failure that has occurred in a computer system, wherein the failure analysis apparatus performs an acquisition step of acquiring information on the operating status of the computer system, a determination step of determining information on fault repair work for the computer system based on the information on the operating status acquired in the acquisition step, and a transmission step of transmitting the information on the fault repair work determined in the determination step to the augmented reality presentation apparatus; the augmented reality presentation apparatus performs a presentation step of combining computer graphics image data that guides the method of the fault repair work with real-space video and presenting it, based on the information on the fault repair work; and when the failure analysis apparatus newly acquires information on the operating status of the computer system after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus newly determines information on fault repair work for the computer system based on that information on the operating status, and transmits the newly determined information on the fault repair work to the augmented reality presentation apparatus.
The program of the present invention is a program for causing a computer to execute an information processing method in an information processing system having an augmented reality presentation apparatus capable of combining and displaying real-space video and computer graphics image data, and a failure analysis apparatus capable of analyzing a failure that has occurred in a computer system, the method including: an acquisition step in which the failure analysis apparatus acquires information on the operating status of the computer system; a determination step in which the failure analysis apparatus determines information on fault repair work for the computer system based on the information on the operating status acquired in the acquisition step; a transmission step in which the failure analysis apparatus transmits the information on the fault repair work determined in the determination step to the augmented reality presentation apparatus; and a presentation step in which the augmented reality presentation apparatus combines computer graphics image data that guides the method of the fault repair work with real-space video and presents it, based on the information on the fault repair work; wherein the program causes the computer to execute a step in which, when information on the operating status of the computer system is newly acquired in the acquisition step after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus newly determines, in the determination step, information on fault repair work for the computer system based on that information on the operating status, and transmits, in the transmission step, the newly determined information on the fault repair work to the augmented reality presentation apparatus.
A second aspect of the information processing system of the present invention is an information processing system having an augmented reality presentation apparatus capable of combining and displaying real-space video and computer graphics image data, and a failure analysis apparatus capable of analyzing a failure that has occurred in a repair target object, wherein the failure analysis apparatus includes acquisition means for acquiring information on the operating status of the repair target object, determination means for determining information on fault repair work for the repair target object based on the information on the operating status acquired by the acquisition means, and transmission means for transmitting information indicating the information on the fault repair work determined by the determination means to the augmented reality presentation apparatus; the augmented reality presentation apparatus includes presentation means for combining computer graphics image data that guides the method of the fault repair work with real-space video and presenting it, based on the information on the fault repair work; and when information on the operating status of the repair target object is newly acquired by the acquisition means after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus newly determines, by the determination means, information on fault repair work for the repair target object based on that information on the operating status, and transmits, by the transmission means, the newly determined information on the fault repair work to the augmented reality presentation apparatus.
As shown in FIG. 1, the failure analysis system according to the present embodiment includes a failure analysis apparatus 120 that acquires information on the operating status of the various servers, routers 111 to 118, and the like installed in a data center, and determines the method of fault repair work. The determination of the fault repair method is executed, for example, when there is a complaint from a user, such as that the Internet cannot be used. When the failure analysis apparatus 120 determines the method of fault repair work, it transmits guidance information to the location of the fault repair work and guidance information on the method of the fault repair work, via wireless communication lines, to glasses-type HMDs (Head Mounted Displays) 101 and 104 worn on the heads of workers y and x. The HMDs 101 and 104 display computer graphics image data that guides the workers y and x to the location of the fault repair work, and computer graphics image data that guides the method of the fault repair work, combined with real-space video. That is, the HMDs 101 and 104 produce an augmented sense of reality by displaying computer graphics image data at positions aligned with the real-space video that passes optically through and appears on the lens portions of the HMDs 101 and 104. In the following description, such combined display of computer graphics image data giving an augmented sense of reality may be referred to as AR (Augmented Reality) display. Note that there are various types of HMDs, such as non-transmissive, video see-through, and optical see-through. Combined display (AR display) means superimposing computer graphics image data at aligned positions on the real-space video shown on the lens portion of the HMD by any of these methods; for example, with an optical see-through HMD, the data is superimposed on the real space seen through the lens, and with a video see-through HMD, it is superimposed on real-space video captured by a video camera.
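The cycle the failure analysis apparatus carries out, acquiring operating status, determining repair work, guiding the worker via the HMD, then re-checking the status after the repair, can be sketched as a small loop. This is an illustrative reading only, not the patent's implementation, and every name below is hypothetical:

```python
# Sketch of the fault-analysis loop: the analyzer repeatedly acquires
# operating status, determines repair work, and sends guidance to the
# HMD until no fault remains. All names are illustrative.

def repair_loop(get_status, determine_work, send_to_hmd, max_rounds=10):
    for _ in range(max_rounds):
        status = get_status()            # acquisition step
        work = determine_work(status)    # determination step
        if work is None:                 # no fault remains
            return True
        send_to_hmd(work)                # transmission / AR presentation
    return False

# Toy stand-ins: one fault that the first guided repair clears.
faults = ["router 115 link down"]
sent = []
ok = repair_loop(
    get_status=lambda: list(faults),
    determine_work=lambda s: f"repair: {s[0]}" if s else None,
    send_to_hmd=lambda w: (sent.append(w), faults.clear()),
)
print(ok, sent)  # True, with one guidance message sent
```

The key point of the claims is the re-determination: after guided repair work, a fresh status acquisition drives a new determination and a new transmission, which is exactly the loop body above.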
Claims (9)
- An information processing system having an augmented reality presentation apparatus capable of combining and displaying real-space video and computer graphics image data, and a failure analysis apparatus capable of analyzing a failure that has occurred in a computer system, wherein
the failure analysis apparatus comprises:
acquisition means for acquiring information on the operating status of the computer system;
determination means for determining information on fault repair work for the computer system, based on the information on the operating status acquired by the acquisition means; and
transmission means for transmitting the information on the fault repair work determined by the determination means to the augmented reality presentation apparatus,
the augmented reality presentation apparatus comprises
presentation means for combining computer graphics image data that guides the method of the fault repair work with real-space video and presenting it, based on the information on the fault repair work, and
when information on the operating status of the computer system is newly acquired by the acquisition means after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus newly determines, by the determination means, information on fault repair work for the computer system based on that information on the operating status, and transmits, by the transmission means, the newly determined information on the fault repair work to the augmented reality presentation apparatus. - The information processing system according to claim 1, wherein the failure analysis apparatus determines, by the determination means, information on fault repair work to be individually executed on the computer system by a plurality of workers, based on the information on the operating status of the computer system, and transmits, by the transmission means, the method of each fault repair work to the augmented reality presentation apparatus of the corresponding worker.
- The information processing system according to claim 1, wherein the failure analysis apparatus transmits, by the transmission means, guidance information for guiding a route from the current location of the augmented reality presentation apparatus to the location of the fault repair work to the augmented reality presentation apparatus, and
the augmented reality presentation apparatus combines, by the presentation means, computer graphics image data that guides the route from the current location of the augmented reality presentation apparatus to the location of the fault repair work with real-space video and presents it, based on the guidance information. - The information processing system according to claim 1, wherein the determination means determines the proficiency level of the worker who performed the fault repair work, based on the execution result of the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, and
when the failure analysis apparatus newly determines information on fault repair work for the computer system after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus transmits, for the newly determined information on the fault restoration work, information representing a description of the work content corresponding to the proficiency level to the augmented reality presentation apparatus. - The information processing system according to claim 2, wherein the determination means determines the proficiency level of the worker who performed the fault repair work, based on the execution result of the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, and
when determining, by the determination means, information on the fault repair work to be individually executed on the computer system by the plurality of workers based on the information on the operating status of the computer system, the failure analysis apparatus decides the fault repair work assigned to each worker in accordance with the proficiency level. - The information processing system according to claim 1, wherein the failure analysis apparatus further comprises position specifying means for specifying the position of a network cable laid under a free access floor,
the transmission means transmits, to the augmented reality presentation apparatus, position information representing the position of the network cable specified by the position specifying means as part of the information on the fault repair work determined by the determination means, and
in the augmented reality presentation apparatus, the presentation means presents the network cable combined with real-space video as computer graphics image data that guides the method of the fault repair work, based on the position information included in the information on the fault repair work. - An information processing method executed by an augmented reality presentation apparatus capable of combining and displaying real-space video and computer graphics image data, and a failure analysis apparatus capable of analyzing a failure that has occurred in a computer system, wherein
the failure analysis apparatus performs:
an acquisition step of acquiring information on the operating status of the computer system;
a determination step of determining information on fault repair work for the computer system, based on the information on the operating status acquired in the acquisition step; and
a transmission step of transmitting the information on the fault repair work determined in the determination step to the augmented reality presentation apparatus,
the augmented reality presentation apparatus performs
a presentation step of combining computer graphics image data that guides the method of the fault repair work with real-space video and presenting it, based on the information on the fault repair work, and
when the failure analysis apparatus newly acquires information on the operating status of the computer system after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus newly determines information on fault repair work for the computer system based on that information on the operating status, and transmits the newly determined information on the fault repair work to the augmented reality presentation apparatus. - A program for causing a computer to execute an information processing method in an information processing system having an augmented reality presentation apparatus capable of combining and displaying real-space video and computer graphics image data, and a failure analysis apparatus capable of analyzing a failure that has occurred in a computer system, the method including:
an acquisition step in which the failure analysis apparatus acquires information on the operating status of the computer system;
a determination step in which the failure analysis apparatus determines information on fault repair work for the computer system, based on the information on the operating status acquired in the acquisition step;
a transmission step in which the failure analysis apparatus transmits the information on the fault repair work determined in the determination step to the augmented reality presentation apparatus; and
a presentation step in which the augmented reality presentation apparatus combines computer graphics image data that guides the method of the fault repair work with real-space video and presents it, based on the information on the fault repair work,
wherein the program causes the computer to execute a step in which,
when information on the operating status of the computer system is newly acquired in the acquisition step after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus newly determines, in the determination step, information on fault repair work for the computer system based on that information on the operating status, and transmits, in the transmission step, the newly determined information on the fault repair work to the augmented reality presentation apparatus. - An information processing system having an augmented reality presentation apparatus capable of combining and displaying real-space video and computer graphics image data, and a failure analysis apparatus capable of analyzing a failure that has occurred in a repair target object, wherein
the failure analysis apparatus comprises:
acquisition means for acquiring information on the operating status of the repair target object;
determination means for determining information on fault repair work for the repair target object, based on the information on the operating status acquired by the acquisition means; and
transmission means for transmitting information indicating the information on the fault repair work determined by the determination means to the augmented reality presentation apparatus,
the augmented reality presentation apparatus comprises
presentation means for combining computer graphics image data that guides the method of the fault repair work with real-space video and presenting it, based on the information on the fault repair work, and
when information on the operating status of the repair target object is newly acquired by the acquisition means after the fault repair work performed according to the guidance presented by the augmented reality presentation apparatus, the failure analysis apparatus newly determines, by the determination means, information on fault repair work for the repair target object based on that information on the operating status, and transmits, by the transmission means, the newly determined information on the fault repair work to the augmented reality presentation apparatus.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011800037960A CN102483863B (zh) | 2010-04-28 | 2011-03-30 | 信息处理系统以及信息处理方法 |
US13/497,029 US8760471B2 (en) | 2010-04-28 | 2011-03-30 | Information processing system, information processing method and program for synthesizing and displaying an image |
EP11774752.7A EP2469475B1 (en) | 2010-04-28 | 2011-03-30 | Information processing system, information processing method, and program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010104071 | 2010-04-28 | ||
JP2010-104071 | 2010-04-28 | ||
JP2011066198A JP4913913B2 (ja) | 2010-04-28 | 2011-03-24 | 情報処理システム、情報処理方法及びプログラム |
JP2011-066198 | 2011-03-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011135968A1 true WO2011135968A1 (ja) | 2011-11-03 |
Family
ID=44861286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/057959 WO2011135968A1 (ja) | 2010-04-28 | 2011-03-30 | 情報処理システム、情報処理方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US8760471B2 (ja) |
EP (1) | EP2469475B1 (ja) |
JP (1) | JP4913913B2 (ja) |
CN (1) | CN102483863B (ja) |
WO (1) | WO2011135968A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012137108A1 (en) * | 2011-04-07 | 2012-10-11 | International Business Machines Corporation | Managing computing systems utilizing augmented reality |
CN103166778A (zh) * | 2011-12-13 | 2013-06-19 | 成都勤智数码科技有限公司 | 一种故障自动化智能处理方法及其装置 |
JP2017041209A (ja) * | 2015-08-21 | 2017-02-23 | 富士通株式会社 | タスク実行支援方法、タスク実行支援装置およびタスク実行支援プログラム |
JP2018151742A (ja) * | 2017-03-10 | 2018-09-27 | 株式会社デンソーウェーブ | 情報表示システム |
JP2019096324A (ja) * | 2017-11-27 | 2019-06-20 | オイヒナー ゲゼルシャフト ミット ベシュレンクテル ハフツング プルス コンパニー コマンディートゲゼルシャフトEUCHNER GmbH + Co. KG | 安全システム |
Families Citing this family (111)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011017305A1 (de) * | 2011-04-15 | 2012-10-18 | Abb Technology Ag | Bedien- und Beobachtungssystem für technische Anlagen |
US9965564B2 (en) * | 2011-07-26 | 2018-05-08 | Schneider Electric It Corporation | Apparatus and method of displaying hardware status using augmented reality |
US9685000B1 (en) * | 2011-09-28 | 2017-06-20 | EMC IP Holding Company LLC | Using augmented reality in data storage management |
US9230501B1 (en) | 2012-01-06 | 2016-01-05 | Google Inc. | Device control utilizing optical flow |
US8952869B1 (en) * | 2012-01-06 | 2015-02-10 | Google Inc. | Determining correlated movements associated with movements caused by driving a vehicle |
US10469916B1 (en) | 2012-03-23 | 2019-11-05 | Google Llc | Providing media content to a wearable device |
US9413893B2 (en) * | 2012-04-05 | 2016-08-09 | Assurant, Inc. | System, method, apparatus, and computer program product for providing mobile device support services |
US8898520B1 (en) * | 2012-04-19 | 2014-11-25 | Sprint Communications Company L.P. | Method of assessing restart approach to minimize recovery time |
US8965624B2 (en) | 2012-08-14 | 2015-02-24 | Ebay Inc. | Method and system of vehicle tracking portal |
JP5966942B2 (ja) * | 2013-01-23 | 2016-08-10 | トヨタ自動車株式会社 | 操作支援システム及び操作支援方法 |
JP6065640B2 (ja) | 2013-02-21 | 2017-01-25 | ブラザー工業株式会社 | コンピュータプログラムおよび制御装置 |
US10678225B2 (en) | 2013-03-04 | 2020-06-09 | Fisher-Rosemount Systems, Inc. | Data analytic services for distributed industrial performance monitoring |
US10649424B2 (en) | 2013-03-04 | 2020-05-12 | Fisher-Rosemount Systems, Inc. | Distributed industrial performance monitoring and analytics |
US10223327B2 (en) | 2013-03-14 | 2019-03-05 | Fisher-Rosemount Systems, Inc. | Collecting and delivering data to a big data machine in a process control system |
US10282676B2 (en) | 2014-10-06 | 2019-05-07 | Fisher-Rosemount Systems, Inc. | Automatic signal processing-based learning in a process plant |
US10649449B2 (en) | 2013-03-04 | 2020-05-12 | Fisher-Rosemount Systems, Inc. | Distributed industrial performance monitoring and analytics |
US10909137B2 (en) | 2014-10-06 | 2021-02-02 | Fisher-Rosemount Systems, Inc. | Streaming data for analytics in process control systems |
US10386827B2 (en) | 2013-03-04 | 2019-08-20 | Fisher-Rosemount Systems, Inc. | Distributed industrial performance monitoring and analytics platform |
US9558220B2 (en) | 2013-03-04 | 2017-01-31 | Fisher-Rosemount Systems, Inc. | Big data in process control systems |
US9804588B2 (en) | 2014-03-14 | 2017-10-31 | Fisher-Rosemount Systems, Inc. | Determining associations and alignments of process elements and measurements in a process |
US10866952B2 (en) | 2013-03-04 | 2020-12-15 | Fisher-Rosemount Systems, Inc. | Source-independent queries in distributed industrial system |
US9823626B2 (en) | 2014-10-06 | 2017-11-21 | Fisher-Rosemount Systems, Inc. | Regional big data in process control systems |
US9397836B2 (en) | 2014-08-11 | 2016-07-19 | Fisher-Rosemount Systems, Inc. | Securing devices to process control systems |
US9665088B2 (en) | 2014-01-31 | 2017-05-30 | Fisher-Rosemount Systems, Inc. | Managing big data in process control systems |
US9959190B2 (en) * | 2013-03-12 | 2018-05-01 | International Business Machines Corporation | On-site visualization of component status |
US10691281B2 (en) | 2013-03-15 | 2020-06-23 | Fisher-Rosemount Systems, Inc. | Method and apparatus for controlling a process plant with location aware mobile control devices |
GB2513455B (en) * | 2013-03-15 | 2020-11-25 | Fisher Rosemount Systems Inc | Generating checklists in a process control environment |
EP3200131A1 (en) | 2013-03-15 | 2017-08-02 | Fisher-Rosemount Systems, Inc. | Data modeling studio |
EP2784681A1 (en) * | 2013-03-27 | 2014-10-01 | British Telecommunications public limited company | Visual diagnosis tool |
WO2014174426A1 (en) * | 2013-04-26 | 2014-10-30 | Koninklijke Philips N.V. | Decision support system for a lighting network |
WO2015001611A1 (ja) | 2013-07-02 | 2015-01-08 | 株式会社日立製作所 | ネットワーク構築支援システム及び方法 |
JP6355909B2 (ja) * | 2013-10-18 | 2018-07-11 | 三菱重工業株式会社 | 検査記録装置及び検査記録評価方法 |
US9740935B2 (en) * | 2013-11-26 | 2017-08-22 | Honeywell International Inc. | Maintenance assistant system |
EP3090507B1 (en) * | 2013-12-30 | 2020-02-12 | Telecom Italia S.p.A. | Augmented reality for supporting intervention of a network apparatus by a human operator |
US9495236B2 (en) * | 2014-04-07 | 2016-11-15 | Cubic Corporation | Intuitive visual assessment of device operational health |
JP6713731B2 (ja) * | 2014-06-05 | 2020-06-24 | トーヨーカネツ株式会社 | Ar利用広告提供システム及びar利用顧客付加価値情報提供システム |
JP6497851B2 (ja) * | 2014-06-05 | 2019-04-10 | トーヨーカネツソリューションズ株式会社 | Ar利用取説提供システム |
US10168691B2 (en) | 2014-10-06 | 2019-01-01 | Fisher-Rosemount Systems, Inc. | Data pipeline for process control system analytics |
JP2016138908A (ja) * | 2015-01-26 | 2016-08-04 | セイコーエプソン株式会社 | 表示システム、可搬型表示装置、表示制御装置、表示方法 |
JP6478713B2 (ja) * | 2015-03-04 | 2019-03-06 | キヤノン株式会社 | 計測装置および計測方法 |
WO2016147656A1 (ja) * | 2015-03-16 | 2016-09-22 | 日本電気株式会社 | 情報処理装置、情報処理方法、及び、記録媒体 |
JP6464889B2 (ja) * | 2015-03-31 | 2019-02-06 | 富士通株式会社 | 画像処理装置、画像処理プログラム、及び画像処理方法 |
MY201893A (en) | 2015-08-21 | 2024-03-22 | Ns Solutions Corp | Display system and information processing method |
CN107924585B (zh) * | 2015-08-25 | 2021-03-09 | 日铁系统集成株式会社 | 作业辅助装置、作业辅助方法以及存储介质 |
JP2017049763A (ja) * | 2015-09-01 | 2017-03-09 | 株式会社東芝 | 電子機器、支援システムおよび支援方法 |
JP2017049762A (ja) | 2015-09-01 | 2017-03-09 | 株式会社東芝 | システム及び方法 |
JP2017049869A (ja) | 2015-09-03 | 2017-03-09 | 株式会社東芝 | メガネ型ウェアラブル端末とそのデータ処理方法 |
JP2017058783A (ja) * | 2015-09-14 | 2017-03-23 | 富士ゼロックス株式会社 | 情報処理装置、情報処理方法、情報処理システム及びプログラム |
JP6670580B2 (ja) * | 2015-10-21 | 2020-03-25 | 株式会社シンコネクト | 建築分野用システム |
US10114717B2 (en) * | 2015-10-27 | 2018-10-30 | Fluke Corporation | System and method for utilizing machine-readable codes for testing a communication network |
US10528021B2 (en) | 2015-10-30 | 2020-01-07 | Rockwell Automation Technologies, Inc. | Automated creation of industrial dashboards and widgets |
CN106681881A (zh) * | 2015-11-05 | 2017-05-17 | 中兴通讯股份有限公司 | 一种数据中心巡检方法及装置 |
US10313281B2 (en) | 2016-01-04 | 2019-06-04 | Rockwell Automation Technologies, Inc. | Delivery of automated notifications by an industrial asset |
US10503483B2 (en) | 2016-02-12 | 2019-12-10 | Fisher-Rosemount Systems, Inc. | Rule builder in a process control network |
EP3417307A4 (en) * | 2016-02-18 | 2019-09-11 | Edx Technologies, Inc. | SYSTEMS AND METHODS FOR AUGMENTED REALITY DISPLAYS OF NETWORKS |
WO2017150292A1 (ja) * | 2016-03-04 | 2017-09-08 | 新日鉄住金ソリューションズ株式会社 | 表示システム、情報処理装置、情報処理方法及びプログラム |
WO2017150293A1 (ja) * | 2016-03-04 | 2017-09-08 | 新日鉄住金ソリューションズ株式会社 | 情報処理システム、情報処理装置、情報処理方法及びプログラム |
JP2017175354A (ja) * | 2016-03-23 | 2017-09-28 | Kddi株式会社 | システム、情報処理装置、ヘッドマウント装置、及びプログラム |
JP2017182487A (ja) * | 2016-03-30 | 2017-10-05 | Kddi株式会社 | システム、情報処理装置、ヘッドマウント装置、及び方法 |
ITUA20162756A1 (it) * | 2016-04-20 | 2017-10-20 | Newbiquity Sagl | Metodo e sistema di assistenza a distanza in tempo reale con impiego di computer vision e realtà aumentata |
WO2017182523A1 (en) * | 2016-04-20 | 2017-10-26 | Newbiquity Sagl | A method and a system for real-time remote support with use of computer vision and augmented reality |
KR20170121930A (ko) | 2016-04-26 | 2017-11-03 | 현대자동차주식회사 | 웨어러블 기기 및 이를 포함하는 차량 진단 장치 |
US10650591B1 (en) | 2016-05-24 | 2020-05-12 | Out of Sight Vision Systems LLC | Collision avoidance system for head mounted display utilized in room scale virtual reality system |
US10981060B1 (en) | 2016-05-24 | 2021-04-20 | Out of Sight Vision Systems LLC | Collision avoidance system for room scale virtual reality system |
JP6749144B2 (ja) * | 2016-05-31 | 2020-09-02 | 東芝テック株式会社 | 情報処理装置、及びプログラム |
KR102610021B1 (ko) | 2016-08-12 | 2023-12-04 | 매직 립, 인코포레이티드 | 단어 흐름 주석 |
US10318570B2 (en) | 2016-08-18 | 2019-06-11 | Rockwell Automation Technologies, Inc. | Multimodal search input for an industrial search platform |
JP6521923B2 (ja) * | 2016-09-21 | 2019-05-29 | 株式会社 日立産業制御ソリューションズ | 作業支援装置及び作業支援方法 |
US10545492B2 (en) | 2016-09-26 | 2020-01-28 | Rockwell Automation Technologies, Inc. | Selective online and offline access to searchable industrial automation data |
US10401839B2 (en) | 2016-09-26 | 2019-09-03 | Rockwell Automation Technologies, Inc. | Workflow tracking and identification using an industrial monitoring system |
US10319128B2 (en) | 2016-09-26 | 2019-06-11 | Rockwell Automation Technologies, Inc. | Augmented reality presentation of an industrial environment |
US10388075B2 (en) * | 2016-11-08 | 2019-08-20 | Rockwell Automation Technologies, Inc. | Virtual reality and augmented reality for industrial automation |
US10735691B2 (en) | 2016-11-08 | 2020-08-04 | Rockwell Automation Technologies, Inc. | Virtual reality and augmented reality for industrial automation |
US10866631B2 (en) | 2016-11-09 | 2020-12-15 | Rockwell Automation Technologies, Inc. | Methods, systems, apparatuses, and techniques for employing augmented reality and virtual reality |
JP2018092227A (ja) * | 2016-11-30 | 2018-06-14 | ソニー株式会社 | 提示制御装置、提示制御方法およびプログラム |
DE102017108622A1 (de) * | 2017-04-23 | 2018-10-25 | Goodly Innovations GmbH | System zur unterstützung von teamarbeit mittels augmented reality |
JP6867883B2 (ja) * | 2017-06-01 | 2021-05-12 | 株式会社日立ドキュメントソリューションズ | 空間表示装置、および、空間表示方法 |
CN107302454A (zh) * | 2017-06-16 | 2017-10-27 | 郑州云海信息技术有限公司 | 一种基于ar技术的机房服务器管理方法及系统 |
JP6917844B2 (ja) * | 2017-09-20 | 2021-08-11 | 株式会社東芝 | 作業支援システム、作業支援方法及び作業支援プログラム |
JP2019067050A (ja) * | 2017-09-29 | 2019-04-25 | 株式会社日立ビルシステム | 作業支援システム |
US10445944B2 (en) | 2017-11-13 | 2019-10-15 | Rockwell Automation Technologies, Inc. | Augmented reality safety automation zone system and method |
JP2018037107A (ja) * | 2017-11-27 | 2018-03-08 | 株式会社東芝 | システム及び方法 |
US11222081B2 (en) | 2017-11-27 | 2022-01-11 | Evoqua Water Technologies Llc | Off-line electronic documentation solutions |
US10796153B2 (en) | 2018-03-12 | 2020-10-06 | International Business Machines Corporation | System for maintenance and repair using augmented reality |
CN112219189A (zh) * | 2018-03-27 | 2021-01-12 | 奈飞公司 | 用于调度的反熵修复设计的技术 |
US20190042843A1 (en) * | 2018-04-05 | 2019-02-07 | Intel Corporation | Cable detection for ar/vr computing method and apparatus |
CN108919882A (zh) * | 2018-04-24 | 2018-11-30 | 北京拓盛智联技术有限公司 | 一种基于ar技术的设备主动安全操作方法和系统 |
CN108596349A (zh) * | 2018-04-28 | 2018-09-28 | 广东日月潭电源科技有限公司 | 一种基于ar技术的ups电源维修方法、系统及装置 |
JP7113702B2 (ja) * | 2018-08-31 | 2022-08-05 | 三菱電機株式会社 | 点検作業支援装置、点検作業支援システム、および点検作業支援方法 |
JP6816077B2 (ja) * | 2018-09-14 | 2021-01-20 | ファナック株式会社 | 誘導情報提示装置、誘導情報提示サーバ、誘導情報提示システム、誘導情報提示方法及びプログラム |
US10831588B2 (en) * | 2018-10-16 | 2020-11-10 | International Business Machines Corporation | Diagnosis of data center incidents with augmented reality and cognitive analytics |
US11145130B2 (en) * | 2018-11-30 | 2021-10-12 | Apprentice FS, Inc. | Method for automatically capturing data from non-networked production equipment |
US10966342B2 (en) * | 2019-01-31 | 2021-03-30 | Dell Products, L.P. | System and method for determining location and navigating a datacenter using augmented reality and available sensor data |
JP7058621B2 (ja) * | 2019-02-22 | 2022-04-22 | 株式会社日立製作所 | 映像記録装置およびヘッドマウントディスプレイ |
US11062264B2 (en) | 2019-03-08 | 2021-07-13 | Grace Technology, Inc. | Work support system, work support server, work situation determination apparatus, device for worker, and work object equipment |
JP7040488B2 (ja) * | 2019-03-28 | 2022-03-23 | オムロン株式会社 | トラブルシュート支援装置、トラブルシュート支援方法、およびプログラム |
JP7338200B2 (ja) * | 2019-03-29 | 2023-09-05 | 村田機械株式会社 | 保守方法、及び、保守サーバ |
JP7334460B2 (ja) * | 2019-04-26 | 2023-08-29 | 富士通株式会社 | 作業支援装置及び作業支援方法 |
WO2020261329A1 (ja) * | 2019-06-24 | 2020-12-30 | サン電子株式会社 | 機能実行システム |
US11842025B2 (en) * | 2019-08-06 | 2023-12-12 | Sony Group Corporation | Information processing device and information processing method |
US20210056272A1 (en) | 2019-08-23 | 2021-02-25 | KEFI Holdings, Inc | Object detection-based control of projected content |
US11792288B2 (en) * | 2019-09-09 | 2023-10-17 | Extreme Networks, Inc. | Wireless network device with directional communication functionality |
JP2022551978A (ja) | 2019-10-15 | 2022-12-14 | オラクル・インターナショナル・コーポレイション | データセンタオペレーションまたはクラウドインフラストラクチャで仮想現実または拡張現実を使用するためのシステムおよび方法 |
CN110826986A (zh) * | 2019-10-24 | 2020-02-21 | 广东优世联合控股集团股份有限公司 | 一种精准确定规建外包费用数据的处理方法和装置 |
JP2021067899A (ja) * | 2019-10-28 | 2021-04-30 | 株式会社日立製作所 | 頭部装着型表示装置および表示コンテンツ制御方法 |
JP7259717B2 (ja) * | 2019-11-28 | 2023-04-18 | 富士通株式会社 | 表示制御方法、表示制御プログラムおよび表示制御装置 |
US11074730B1 (en) * | 2020-01-23 | 2021-07-27 | Netapp, Inc. | Augmented reality diagnostic tool for data center nodes |
JP7149354B2 (ja) * | 2020-03-27 | 2022-10-06 | Sppテクノロジーズ株式会社 | 保守作業支援システム、保守作業支援方法、および、保守作業支援プログラム |
JP7044838B2 (ja) * | 2020-08-07 | 2022-03-30 | 東芝テック株式会社 | 情報処理装置、及びプログラム |
CN113778222A (zh) * | 2021-07-23 | 2021-12-10 | 中国人民解放军海军航空大学青岛校区 | 一种基于ar的特情处置方法及装置 |
CN114063909B (zh) * | 2021-10-25 | 2023-12-08 | 华东计算技术研究所(中国电子科技集团公司第三十二研究所) | 图片数据的智能分布式存储方法及系统 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002538543A (ja) * | 1999-03-02 | 2002-11-12 | シーメンス アクチエンゲゼルシヤフト | 強調現実感技術により対話を状況に応じて支援するためのシステム及び方法 |
WO2007096971A1 (ja) * | 2006-02-23 | 2007-08-30 | Fujitsu Limited | 保守ガイダンス表示装置、保守ガイダンス表示方法、保守ガイダンス表示プログラム |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4964077A (en) * | 1987-10-06 | 1990-10-16 | International Business Machines Corporation | Method for automatically adjusting help information displayed in an online interactive system |
US5680541A (en) * | 1991-12-16 | 1997-10-21 | Fuji Xerox Co., Ltd. | Diagnosing method and apparatus |
JP2000098019A (ja) | 1998-09-22 | 2000-04-07 | Honda Electronic Co Ltd | 超音波移動体位置検出装置 |
US7324081B2 (en) | 1999-03-02 | 2008-01-29 | Siemens Aktiengesellschaft | Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus |
US7013284B2 (en) * | 1999-05-04 | 2006-03-14 | Accenture Llp | Component based interface to handle tasks during claim processing |
US7126558B1 (en) * | 2001-10-19 | 2006-10-24 | Accenture Global Services Gmbh | Industrial augmented reality |
US20030115088A1 (en) * | 2001-12-18 | 2003-06-19 | Crossmark, Inc. | System and method of routing, scheduling, and monitoring a workforce |
JP2003298821A (ja) * | 2002-04-03 | 2003-10-17 | Sharp Corp | 画像出力装置 |
JP4396212B2 (ja) * | 2002-10-10 | 2010-01-13 | セイコーエプソン株式会社 | 作業担当者支援方法 |
EP1649419A4 (en) * | 2003-08-29 | 2007-04-25 | Siemens Med Solutions Health | CUSTOMER SUPPORT SYSTEM |
US7383148B2 (en) * | 2004-03-25 | 2008-06-03 | Siemens Building Technologies, Inc. | Method and apparatus for graphically displaying a building system |
JP2006099405A (ja) * | 2004-09-29 | 2006-04-13 | Seiko Epson Corp | コンテンツ配信システム、コンテンツ配信方法及びそのプログラム |
US7251585B2 (en) * | 2005-06-14 | 2007-07-31 | Siemens Aktiengesellschaft | Method and computer system for formulating potential measures for increasing the reliability of a technical system |
JP2007034712A (ja) * | 2005-07-27 | 2007-02-08 | Nec Fielding Ltd | 保守支援システム、保守支援方法、及び、保守支援プログラム |
US20080005617A1 (en) * | 2006-05-15 | 2008-01-03 | The Boeing Company | Automated processing of electronic log book pilot reports for ground-based fault processing |
US8364514B2 (en) * | 2006-06-27 | 2013-01-29 | Microsoft Corporation | Monitoring group activities |
JP2008111588A (ja) * | 2006-10-30 | 2008-05-15 | Fujitsu Ltd | 空調設備およびコンピュータシステム |
US7760094B1 (en) * | 2006-12-14 | 2010-07-20 | Corning Cable Systems Llc | RFID systems and methods for optical fiber network deployment and maintenance |
JP2008158684A (ja) * | 2006-12-21 | 2008-07-10 | Konica Minolta Business Technologies Inc | 画像監視システム、画像監視方法および画像監視プログラム |
JP2008201101A (ja) | 2007-02-22 | 2008-09-04 | Kyocera Mita Corp | 操作案内表示装置 |
US7696879B2 (en) * | 2007-12-18 | 2010-04-13 | Alcatel Lucent | Ascertaining physical routing of cabling interconnects |
CN101971141A (zh) * | 2008-01-31 | 2011-02-09 | 自适应计算企业股份有限公司 | 用于管理混合计算环境的系统和方法 |
GB0805596D0 (en) * | 2008-03-27 | 2008-04-30 | British Telecomm | Tagged cable |
US7940182B2 (en) * | 2008-04-30 | 2011-05-10 | Alcatel Lucent | RFID encoding for identifying system interconnect cables |
US20120242698A1 (en) * | 2010-02-28 | 2012-09-27 | Osterhout Group, Inc. | See-through near-eye display glasses with a multi-segment processor-controlled optical layer |
EP2592564A1 (en) * | 2010-07-05 | 2013-05-15 | Nec Corporation | Service provision device for electronic documents, service provision method for electronic documents, and service provision terminal for electronic documents |
WO2012142250A1 (en) * | 2011-04-12 | 2012-10-18 | Radiation Monitoring Devices, Inc. | Augmented reality system |
- 2011-03-24 JP JP2011066198A patent/JP4913913B2/ja active Active
- 2011-03-30 EP EP11774752.7A patent/EP2469475B1/en active Active
- 2011-03-30 WO PCT/JP2011/057959 patent/WO2011135968A1/ja active Application Filing
- 2011-03-30 US US13/497,029 patent/US8760471B2/en active Active
- 2011-03-30 CN CN2011800037960A patent/CN102483863B/zh active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002538543A (ja) * | 1999-03-02 | 2002-11-12 | Siemens Aktiengesellschaft | System and method for situation-dependent interaction support using augmented reality technology |
WO2007096971A1 (ja) * | 2006-02-23 | 2007-08-30 | Fujitsu Limited | Maintenance guidance display device, maintenance guidance display method, and maintenance guidance display program |
Non-Patent Citations (2)
Title |
---|
HIROTAKE ISHII: "Application 2 : Plant Maintenance Support", JOHO SHORI, vol. 51, no. 4, 15 April 2010 (2010-04-15), pages 392 - 397, XP008151100 * |
See also references of EP2469475A4 * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9219665B2 (en) | 2011-04-07 | 2015-12-22 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
WO2012137108A1 (en) * | 2011-04-07 | 2012-10-11 | International Business Machines Corporation | Managing computing systems utilizing augmented reality |
GB2505099A (en) * | 2011-04-07 | 2014-02-19 | Ibm | Managing computing systems utilizing augmented reality |
US8738754B2 (en) | 2011-04-07 | 2014-05-27 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
US8918494B2 (en) | 2011-04-07 | 2014-12-23 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
US8990385B2 (en) | 2011-04-07 | 2015-03-24 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
US9391860B2 (en) | 2011-04-07 | 2016-07-12 | Globalfoundries Inc. | Systems and methods for managing computing systems utilizing augmented reality |
US9219666B2 (en) | 2011-04-07 | 2015-12-22 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
US9712413B2 (en) | 2011-04-07 | 2017-07-18 | Globalfoundries Inc. | Systems and methods for managing computing systems utilizing augmented reality |
CN103166778A (zh) * | 2011-12-13 | 2013-06-19 | Chengdu Qinzhi Digital Technology Co., Ltd. | Automated intelligent fault processing method and device |
JP2017041209A (ja) * | 2015-08-21 | 2017-02-23 | Fujitsu Ltd | Task execution support method, task execution support device, and task execution support program |
JP2018151742A (ja) * | 2017-03-10 | 2018-09-27 | Denso Wave Inc. | Information display system |
JP2019096324A (ja) * | 2017-11-27 | 2019-06-20 | Euchner GmbH + Co. KG | Safety system |
JP7446709B2 (ja) | 2017-11-27 | 2024-03-11 | Euchner GmbH + Co. KG | Safety system |
Also Published As
Publication number | Publication date |
---|---|
EP2469475A1 (en) | 2012-06-27 |
EP2469475B1 (en) | 2017-08-30 |
JP4913913B2 (ja) | 2012-04-11 |
CN102483863B (zh) | 2013-11-20 |
JP2011248860A (ja) | 2011-12-08 |
US20130120449A1 (en) | 2013-05-16 |
US8760471B2 (en) | 2014-06-24 |
EP2469475A4 (en) | 2013-03-13 |
CN102483863A (zh) | 2012-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4913913B2 (ja) | Information processing system, information processing method, and program | |
US9746913B2 (en) | Secured mobile maintenance and operator system including wearable augmented reality interface, voice command interface, and visual recognition systems and related methods | |
US8659635B2 (en) | Information processing system and information processing method | |
EP3659132B1 (en) | Position-based location indication and device control | |
JP6004051B2 (ja) | Information processing system, control method therefor, and program, and information processing apparatus, control method therefor, and program | |
KR20110097305A (ko) | System and method for providing a user manual using augmented reality | |
JP6711033B2 (ja) | Display control method, communication device, display control program, and display control device | |
EP3822923A1 (en) | Maintenance assistance system, maintenance assistance method, program, method for generating processed image, and processed image | |
US10915754B2 (en) | System and method for use of augmented reality in outfitting a dynamic structural space | |
US20160364396A1 (en) | Information search system and information search method | |
JP7109395B2 (ja) | Work support system, work support device, and work support method | |
US20210302945A1 (en) | Information processing apparatus, information processing system, and non-transitory computer readable medium storing program | |
US20200242797A1 (en) | Augmented reality location and display using a user-aligned fiducial marker | |
US20220012868A1 (en) | Maintenance support system, maintenance support method, program, method for generating processed image, and processed image | |
JP7149354B2 (ja) | Maintenance work support system, maintenance work support method, and maintenance work support program | |
WO2024034416A1 (ja) | Support device, support system, and support method | |
US12002162B2 (en) | Method and apparatus for providing virtual contents in virtual space based on common coordinate system | |
JP2023044464A (ja) | Video display system, video display device, and video display method | |
KR20210085929A (ko) | Augmented reality communication method between multiple users | |
CN113383371A (zh) | Method and device for providing virtual content in a virtual space based on a common coordinate system | |
JP2020170482A (ja) | Work instruction system | |
JP2006065517A (ja) | Real-space information addition device, method, and program | |
JP2020198001A (ja) | Work support system, work support method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180003796.0 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11774752 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13497029 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2011774752 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011774752 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1201002794 Country of ref document: TH |
|
NENP | Non-entry into the national phase |
Ref country code: DE |