WO2023223388A1 - Work monitoring device, work monitoring method, work monitoring program, work training system, and work training method - Google Patents

Work monitoring device, work monitoring method, work monitoring program, work training system, and work training method Download PDF

Info

Publication number
WO2023223388A1
WO2023223388A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
worker
positional relationship
tool
action
Prior art date
Application number
PCT/JP2022/020403
Other languages
English (en)
Japanese (ja)
Inventor
渉 伏見
貴耶 谷口
敬士 西川
健二 瀧井
Original Assignee
三菱電機株式会社
三菱電機ビルソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社, 三菱電機ビルソリューションズ株式会社
Priority to PCT/JP2022/020403 priority Critical patent/WO2023223388A1/fr
Priority to JP2022571865A priority patent/JPWO2023223388A1/ja
Publication of WO2023223388A1 publication Critical patent/WO2023223388A1/fr

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02 - Alarms for ensuring the safety of persons

Definitions

  • The present disclosure relates to technology for realizing safe work.
  • Patent Document 1 discloses an MR system that predicts a collision between an operating tool and a real object based on position information of the operating tool and position information of the real object. If the operating tool is predicted to collide with the real object, it is determined whether the collision would damage the operating tool or the real object. If so, a warning is presented to the user. The warning makes the user aware of the danger and helps the user avoid the collision. If the predicted collision would not cause damage, no warning is presented, which reduces the annoyance caused by frequent warnings.
  • MR is an abbreviation for Mixed Reality.
  • In Patent Document 1, the operating tool is a rod-shaped tool held in the user's hand to cut a virtual object.
  • When the virtual object is cut, a cross section of the virtual object can be observed. If the user mistakes a real object for a virtual object, the operating tool collides with the real object, damaging the operating tool or the real object. If the operating tool is about to collide with a real object at a speed above a threshold, a warning is given to the user.
  • In Patent Document 1, tool-specific movements, such as a circular movement of the arm while holding the operating tool, are not taken into consideration. Nor is the risk that, when force is applied to an operating tool placed on an object, the tool may slip off the object and the user may lose balance.
  • Furthermore, in Patent Document 1, a collision is predicted from the change in the position of the operating tool after the operation starts. A warning therefore cannot be issued before the operation starts.
  • The present disclosure aims to make it possible to notify a worker before an unsafe action starts.
  • The work monitoring device of the present disclosure includes: a position information acquisition unit that acquires position information of each of a tool, a work object acted upon by the tool, and an obstacle to the work by analyzing a video showing work performed by a worker using the tool; a motion determination unit that refers to cause action data, which includes positional relationship information indicating the positional relationship among the tool, the work object, and the obstacle in a cause action (an action that leads to an unsafe action), and determines that the worker's action is the cause action when the positional relationship based on the acquired position information matches the positional relationship indicated by the positional relationship information;
  • and a notification unit that notifies the worker using an output device when the worker's action is determined to be the cause action.
  • With this device, the worker is notified when an action that leads to an unsafe action is performed. In other words, the worker can be notified before the unsafe action starts.
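  • As a concrete illustration of the flow above, here is a minimal Python sketch. Everything in it (the Positions structure, the encoding of a positional relationship as a pair of distances, the matching tolerance) is an assumption made for illustration; the disclosure does not prescribe a data format or matching rule.

```python
from dataclasses import dataclass

@dataclass
class Positions:
    tool: tuple          # (x, y, z) of the tool, e.g. a spanner
    work_object: tuple   # (x, y, z) of the object acted upon, e.g. a bolt
    obstacle: tuple      # (x, y, z) of an obstacle to the work

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def relationship(p):
    # Encode a positional relationship as (tool-to-object, tool-to-obstacle) distances.
    return (distance(p.tool, p.work_object), distance(p.tool, p.obstacle))

def is_cause_action(p, registered, tol=0.05):
    # The worker's action is a cause action if the observed relationship
    # matches any relationship registered for a cause action.
    obs = relationship(p)
    return any(all(abs(o - r) <= tol for o, r in zip(obs, reg)) for reg in registered)

# Relationship registered for a known cause action: spanner seated on the
# bolt, obstacle 0.3 m from the tool (case (2) of FIG. 5 below).
registered = [(0.0, 0.30)]
observed = Positions(tool=(0.0, 0.0, 0.0), work_object=(0.0, 0.0, 0.0),
                     obstacle=(0.30, 0.0, 0.0))
if is_cause_action(observed, registered):
    print("warning: this action leads to an unsafe action")  # notify before it starts
```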
  • FIG. 1 is a configuration diagram of a work training system 100 in Embodiment 1.
  • FIG. 2 is a configuration diagram of a storage unit 290 in Embodiment 1.
  • FIG. 3 is a flowchart of a work monitoring method in Embodiment 1.
  • FIG. 4 is a configuration diagram of cause action data 120 in Embodiment 1.
  • FIG. 5 is a diagram explaining an example of a cause action in Embodiment 1.
  • FIG. 6 is a configuration diagram of the cause action data 120 (Example 1) in Embodiment 1.
  • FIG. 7 is a configuration diagram of the cause action data 120 (Example 2) in Embodiment 1.
  • FIG. 8 is a configuration diagram of the cause action data 120 (Example 3) in Embodiment 1.
  • FIG. 9 is a configuration diagram of the storage unit 290 in Embodiment 2.
  • FIG. 10 is a flowchart of a work monitoring method in Embodiment 2.
  • FIG. 11 is a configuration diagram of the work training system 100 in Embodiment 3.
  • FIG. 12 is a flowchart of a cause action recording method in Embodiment 3.
  • FIG. 13 is a hardware configuration diagram of a work monitoring device 200 in the embodiments.
  • Embodiment 1. The work training system 100 will be explained based on FIGS. 1 to 8.
  • The configuration of the work training system 100 will be explained based on FIG. 1.
  • The work training system 100 includes a work monitoring device 200 and input/output devices.
  • Specifically, the work training system 100 includes MR glasses 110 as an input/output device.
  • The work training system 100 also includes a speaker 113 as an output device.
  • The MR glasses 110 include a camera 111 and a display 112.
  • The MR glasses 110 are a device for showing MR to the user, and display MR images on the display 112.
  • An MR image is a composite image of a real object and a virtual object.
  • MR is an abbreviation for mixed reality.
  • The work monitoring device 200 is a computer that includes hardware such as a processor 201, a memory 202, an auxiliary storage device 203, a communication device 204, and an input/output interface 205. These pieces of hardware are connected to one another via signal lines.
  • The processor 201 is an IC that performs arithmetic processing and controls the other hardware.
  • For example, the processor 201 is a CPU.
  • IC is an abbreviation for Integrated Circuit.
  • CPU is an abbreviation for Central Processing Unit.
  • Memory 202 is a volatile or non-volatile storage device. Memory 202 is also called main storage or main memory. For example, memory 202 is a RAM. The data stored in memory 202 is stored in auxiliary storage device 203 as needed. RAM is an abbreviation for Random Access Memory.
  • The auxiliary storage device 203 is a non-volatile storage device.
  • For example, the auxiliary storage device 203 is a ROM, an HDD, a flash memory, or a combination of these. Data stored in the auxiliary storage device 203 is loaded into the memory 202 as needed.
  • ROM is an abbreviation for Read Only Memory.
  • HDD is an abbreviation for Hard Disk Drive.
  • Communication device 204 is a receiver and transmitter.
  • For example, the communication device 204 is a communication chip or a NIC.
  • Communication by the work monitoring device 200 is performed using the communication device 204.
  • NIC is an abbreviation for Network Interface Card.
  • The input/output interface 205 is a port to which input devices and output devices are connected.
  • For example, the input/output interface 205 is a USB port, the input devices are a keyboard and a mouse, and the output device is a display.
  • The MR glasses 110 and the speaker 113 are connected to the input/output interface 205.
  • The MR glasses 110 and the speaker 113 may instead communicate with the work monitoring device 200 via the communication device 204.
  • Input/output of the work monitoring device 200 is performed using the input/output interface 205.
  • USB is an abbreviation for Universal Serial Bus.
  • The work monitoring device 200 includes elements such as a position information acquisition unit 210, a motion determination unit 220, a notification unit 230, a work instruction unit 281, and a video control unit 282. These elements are implemented in software.
  • The auxiliary storage device 203 stores a work monitoring program for causing a computer to function as the position information acquisition unit 210, the motion determination unit 220, the notification unit 230, the work instruction unit 281, and the video control unit 282.
  • The work monitoring program is loaded into the memory 202 and executed by the processor 201.
  • The auxiliary storage device 203 further stores an OS. At least a part of the OS is loaded into the memory 202 and executed by the processor 201.
  • The processor 201 executes the work monitoring program while executing the OS.
  • OS is an abbreviation for Operating System.
  • Input/output data of the work monitoring program is stored in the storage unit 290.
  • The memory 202 functions as the storage unit 290.
  • However, storage devices such as the auxiliary storage device 203, registers in the processor 201, and cache memory in the processor 201 may function as the storage unit 290 instead of, or together with, the memory 202.
  • The work monitoring device 200 may include a plurality of processors in place of the processor 201.
  • The work monitoring program can be recorded (stored) in a computer-readable manner on a non-volatile recording medium such as an optical disc or a flash memory.
  • The configuration of the storage unit 290 will be explained based on FIG. 2.
  • The storage unit 290 stores in advance a cause action database 291, work instruction data 292, virtual object data 293, and the like. These data will be described later.
  • The operating procedure of the work training system 100 corresponds to a work training method.
  • The operating procedure of the work monitoring device 200 corresponds to a work monitoring method. The operating procedure of the work monitoring device 200 also corresponds to the processing procedure of the work monitoring program.
  • The work monitoring method will be explained based on FIG. 3.
  • In the work training system 100, work performed by a worker is monitored.
  • The worker performs the work using a tool.
  • A specific example of the tool is a spanner.
  • An object that receives an action from the tool is called a "work object."
  • A specific example of the work object is a bolt.
  • An object that gets in the way of the work is called an "obstacle."
  • The MR glasses 110 have the camera 111 and the display 112.
  • In step S101, the camera 111 captures the work and outputs a work video.
  • The video control unit 282 receives the work video and generates an MR image using the virtual object data 293.
  • An MR image is an image in which a virtual object is superimposed on the real world. Real objects and virtual objects are visible in an MR image. A virtual object is represented by an image. A specific example of a real object is the tool. Specific examples of virtual objects are the work object and the obstacle. However, either the work object or the obstacle may be a real object. Furthermore, the tool, the work object, and the obstacle may all be virtual objects. The virtual object data 293 indicates information about the virtual objects.
  • The video control unit 282 transmits an image of the virtual object, aligned with the real world, to the MR glasses 110, thereby superimposing the virtual object on the real world seen through the display 112 and displaying an MR image.
  • The display 112 receives and displays the MR image. Via the display 112, the worker views an MR image in which the real world and the virtual object are superimposed.
  • The camera 111 continues capturing and outputs the work video in real time.
  • The MR image is then updated in accordance with the work video.
  • In step S111, the work instruction unit 281 refers to the work instruction data 292 and instructs the worker to perform a work.
  • The work instruction data 292 indicates the contents of the instruction for each of one or more works to be instructed to the worker.
  • The work instruction is given using an output device.
  • For example, the work instruction unit 281 outputs, from the speaker 113, audio that conveys the contents of the instruction.
  • In step S121, the position information acquisition unit 210 acquires position information of each work element by analyzing the MR image.
  • A work element is an element involved in the work. Specific work elements are the tool, the work object, and the obstacle.
  • The position information indicates the position and the orientation of each work element.
  • In step S122, the motion determination unit 220 refers to the cause action database 291 and determines, based on the position information of each work element, whether the worker's action is a cause action.
  • A cause action is an action that leads to an unsafe action.
  • An unsafe action is an action that is not safe.
  • Cause action data 120 is registered in the cause action database 291 for each work.
  • The cause action data 120 indicates information that identifies a cause action.
  • The cause action data 120 includes a work name 121 and positional relationship information 122.
  • The work name 121 identifies the work.
  • The positional relationship information 122 indicates the positional relationship of the work elements and the orientation relationship of the work elements. Specifically, the positional relationship information 122 indicates the relationship among the tool, the work object, and the obstacle.
  • In the example of FIG. 5, the work performed by the worker is loosening a bolt 132 using a spanner 131. The spanner 131, the bolt 132, and an obstacle 133 are the work elements.
  • In (1) and (2), the position and orientation of the spanner 131 applied to the bolt 132 differ. In (1), even if the spanner 131 slips off the bolt 132 while being moved in the direction that loosens the bolt 132, the hand holding the spanner 131 does not hit the obstacle 133.
  • In (2), if the spanner 131 slips off the bolt 132 while being moved in the direction that loosens the bolt 132, the hand holding the spanner 131 may hit the obstacle 133. Therefore, in (2), the action of moving the spanner 131 in the direction that loosens the bolt 132 corresponds to an unsafe action.
  • Accordingly, the action of applying the spanner 131 to the bolt 132 as shown in (2) corresponds to the cause action.
  • Specifically, whether or not the worker's action is a cause action is determined as follows. First, the motion determination unit 220 extracts, from the cause action database 291, the cause action data 120 that includes the work name 121 of the instructed work. Next, the motion determination unit 220 calculates the positional relationship of the work elements based on their position information. Then, for each piece of extracted cause action data 120, the motion determination unit 220 determines whether the calculated positional relationship matches the positional relationship indicated by the positional relationship information 122. If the calculated positional relationship matches the positional relationship indicated by any piece of positional relationship information 122, the worker's action is a cause action. If it matches none of them, the worker's action is not a cause action.
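  • A hedged sketch of this procedure in Python. The database layout (pairs of work name 121 and positional relationship information 122), the distance-tuple encoding, and the tolerance are assumptions for illustration.

```python
# cause action database 291, as (work name 121, positional relationship information 122)
cause_action_db = [
    ("loosen bolt", (0.0, 0.30)),   # spanner applied as in case (2) of FIG. 5
    ("tighten nut", (0.0, 0.15)),
]

def matches(observed, registered, tol=0.05):
    return all(abs(o - r) <= tol for o, r in zip(observed, registered))

def is_cause_action(work_name, observed, db, tol=0.05):
    # First: extract the cause action data 120 whose work name 121 is the instructed work.
    candidates = [rel for name, rel in db if name == work_name]
    # Then: compare the calculated relationship against each positional relationship 122.
    return any(matches(observed, rel, tol) for rel in candidates)

print(is_cause_action("loosen bolt", (0.0, 0.30), cause_action_db))  # True  -> step S123
print(is_cause_action("loosen bolt", (0.0, 0.90), cause_action_db))  # False -> step S131
```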
  • If it is determined that the worker's action is a cause action, the process proceeds to step S123. If it is determined that the worker's action is not a cause action, the process proceeds to step S131.
  • In step S123, the notification unit 230 notifies the worker using the output device. Specifically, the notification unit 230 issues a warning to the worker. For example, the notification unit 230 outputs a warning sound from the speaker 113. After step S123, the process proceeds to step S131.
  • In step S131, the work instruction unit 281 determines whether the instructed work has been completed.
  • For example, the work instruction unit 281 refers to the work instruction data 292 and makes the determination by analyzing the MR image.
  • If the instructed work has been completed, the process proceeds to step S132. If the instructed work has not been completed, the process returns to step S121.
  • In step S132, the work instruction unit 281 determines whether all works shown in the work instruction data 292 have been instructed. When all works have been instructed and completed, the process ends. If there is a work that has not yet been instructed, the process returns to step S111.
  • In Embodiment 1, actions that lead to unsafe actions are defined in advance according to the contents of the work instructions. This makes it possible to take into account the specific movements performed with the tool and the risks of those movements. When the worker performs a defined action, a warning about the unsafe action is issued. In other words, the warning can be issued before the unsafe action occurs. This lets workers learn which actions lead to unsafe actions.
  • In Example 1 of Embodiment 1, in step S121, the position information acquisition unit 210 acquires position information of each work element, including the worker's body parts.
  • FIG. 6 shows the configuration of the cause action data 120 in this case.
  • The positional relationship information 122 indicates the relationship among the tool, the work object, the obstacle, and the worker's body parts. With this example, a cause action can be detected more accurately.
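  • One possible encoding, shown as a sketch: the relationship additionally includes the distance between the worker's hand and the obstacle. The choice of distances is an assumption, not something the disclosure specifies.

```python
def relationship(tool, work_object, obstacle, hand):
    d = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # distances among the tool, the work object, the obstacle, and the worker's hand
    return (d(tool, work_object), d(tool, obstacle), d(hand, obstacle))

# spanner seated on the bolt, obstacle 0.3 m from the tool, hand 0.25 m from the obstacle
print(relationship((0, 0, 0), (0, 0, 0), (0.30, 0, 0), (0.05, 0, 0)))
# -> approximately (0.0, 0.3, 0.25)
```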
  • In Example 2 of Embodiment 1, the position information acquisition unit 210 acquires position information of each work element, including the worker.
  • The worker's position information indicates the worker's standing position.
  • FIG. 7 shows the configuration of the cause action data 120 in this case.
  • The cause action data 120 further includes position range information 123.
  • The position range information 123 indicates a range of the worker's standing position.
  • Each piece of cause action data 120 registered for the instructed work is referred to as target cause action data.
  • For each piece of target cause action data, the motion determination unit 220 determines, based on the worker's position information, whether the worker's standing position is included in the range indicated by the position range information 123.
  • The motion determination unit 220 then determines whether any piece of target cause action data is matching cause action data. If any piece of target cause action data is matching cause action data, the worker's action is a cause action.
  • Matching cause action data is target cause action data that satisfies both of the following two conditions (see the sketch below).
  • The first condition is that the positional relationship of the work elements matches the positional relationship indicated by the positional relationship information 122.
  • The second condition is that the worker's standing position is included in the range indicated by the position range information 123.
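  • A hedged sketch of the two-condition check. The rectangular floor-range encoding, the field names, and the tolerance are assumptions for illustration.

```python
def in_range(standing, rng):
    # rng = ((xmin, xmax), (ymin, ymax)): a rectangular range of standing positions
    (xmin, xmax), (ymin, ymax) = rng
    x, y = standing
    return xmin <= x <= xmax and ymin <= y <= ymax

def is_matching_cause_action_data(observed_rel, standing, data, tol=0.05):
    # first condition: the positional relationship matches positional relationship information 122
    rel_ok = all(abs(o - r) <= tol for o, r in zip(observed_rel, data["relationship"]))
    # second condition: the standing position is in the range of position range information 123
    pos_ok = in_range(standing, data["position_range"])
    return rel_ok and pos_ok

data = {"relationship": (0.0, 0.30), "position_range": ((0.0, 1.0), (2.0, 3.0))}
print(is_matching_cause_action_data((0.0, 0.30), (0.5, 2.5), data))  # True  -> cause action
print(is_matching_cause_action_data((0.0, 0.30), (4.0, 2.5), data))  # False
```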
  • With this configuration, a warning can be issued for an action that may injure the worker depending on where the worker is standing (the work position), even when there is no obstacle.
  • For example, it becomes possible to consider not only the danger to the hand holding the tool but also the danger of the worker losing balance. Secondary risks (risks to other people or other equipment) in the event that the tool is dropped can also be taken into account.
  • In Example 3 of Embodiment 1, the position information acquisition unit 210 acquires position information of each work element, including the worker.
  • The worker's position information indicates the worker's position.
  • FIG. 8 shows the configuration of the cause action data 120 in this case.
  • The cause action data 120 further includes operation direction information 124.
  • The operation direction information 124 indicates the operation direction of the tool.
  • The operation direction of the tool is the direction in which the tool is moved.
  • Each piece of cause action data 120 registered for the instructed work is referred to as target cause action data.
  • The motion determination unit 220 predicts the operation direction of the tool based on the position information of the work elements.
  • The predicted operation direction is referred to as the predicted direction.
  • The motion determination unit 220 then determines whether any piece of target cause action data is matching cause action data. If any piece of target cause action data is matching cause action data, the worker's action is a cause action.
  • Matching cause action data is target cause action data that satisfies both of the following two conditions (see the sketch below).
  • The first condition is that the positional relationship of the work elements matches the positional relationship indicated by the positional relationship information 122.
  • The second condition is that the predicted direction matches the operation direction indicated by the operation direction information 124.
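  • A hedged sketch of the direction condition. Representing operation directions as vectors and comparing them by angle, as well as the angular tolerance, are assumptions for illustration.

```python
import math

def direction_matches(predicted, registered, max_angle_deg=15.0):
    # Compare two direction vectors by the angle between them.
    norm = lambda v: sum(x * x for x in v) ** 0.5
    dot = sum(p * r for p, r in zip(predicted, registered)) / (norm(predicted) * norm(registered))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot)))) <= max_angle_deg

def is_matching_cause_action_data(rel_ok, predicted_dir, data):
    # rel_ok: result of the first (positional relationship) condition;
    # second condition: the predicted direction matches operation direction information 124.
    return rel_ok and direction_matches(predicted_dir, data["operation_direction"])

# e.g. a cutter registered as dangerous when moved toward the worker (the -y direction)
data = {"operation_direction": (0.0, -1.0)}
print(is_matching_cause_action_data(True, (0.05, -0.99), data))  # True -> warn
```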
  • With this configuration, a warning can be issued for an action that may injure the worker, according to a characteristic of the tool (the direction in which it is operated) and the worker's positional relationship. For example, a warning can be issued when a cutter is moved toward the worker. In other words, the danger to the worker himself or herself can be taken into account.
  • In Embodiment 1, the following effects can also be obtained, for example.
  • A warning can be issued for an action that moves the tool in a direction that risks the worker losing balance. Further, even when there is no risk of losing balance, a warning can be issued for an action that moves the tool in a direction that risks dropping the tool.
  • The work monitoring device 200 may be used in a system that does not use the MR glasses 110.
  • The work monitoring device 200 may also be used in a system for monitoring actual work instead of in a training system.
  • In that case, the camera 111 may be attached to the worker or installed at one or more locations in the workplace.
  • The work monitoring device 200 may generate an MR video instead of MR images.
  • An MR video is a work video on which a virtual object is superimposed. In other words, an MR video is a video generated by superimposing images of a virtual object on video of the real world.
  • Embodiment 2. Regarding a form in which guidance is output instead of a warning, the main differences from Embodiment 1 will be explained based on FIGS. 9 and 10.
  • The configuration of the storage unit 290 will be explained based on FIG. 9.
  • The storage unit 290 further stores guidance data 294 in advance.
  • The guidance data 294 indicates, for each work, guidance for performing the work while avoiding the cause action.
  • Steps S101 to S122, S131, and S132 in FIG. 10 are as described in Embodiment 1.
  • Step S223 corresponds to step S123 in Embodiment 1.
  • In step S223, the notification unit 230 refers to the guidance data 294 and outputs guidance to the worker. Specifically, the notification unit 230 outputs, from among the guidance shown in the guidance data 294, the guidance for the instructed work. The guidance is output using the output device. For example, the notification unit 230 outputs guidance audio from the speaker 113.
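  • A hedged sketch of step S223. The layout of the guidance data 294, the guidance text, and the speak helper are hypothetical; only the lookup-by-work idea comes from the description above.

```python
# guidance data 294: guidance, per work, for performing the work while avoiding the cause action
guidance_data = {
    "loosen bolt": "Re-seat the spanner so that your hand does not swing "
                   "toward the obstacle, then pull the spanner toward you.",
}

def speak(text):
    print(f"[speaker 113] {text}")  # stand-in for audio output

def notify_guidance(work_name):
    # step S223: output the guidance for the instructed work
    speak(guidance_data.get(work_name, "Caution: the next action is unsafe."))

notify_guidance("loosen bolt")
```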
  • In Embodiment 2, when an action that leads to an unsafe action is detected, guidance can be presented so that the unsafe action is not performed. This allows the worker to prevent unsafe actions and to learn how to deal with them.
  • Embodiment 3. Regarding a form in which the cause action data 120 is generated, the main differences from Embodiment 1 will be explained based on FIGS. 11 and 12.
  • The configuration of the work training system 100 will be explained based on FIG. 11.
  • The configuration of the work training system 100 is similar to the configuration in Embodiment 1.
  • However, the configuration of the work monitoring device 200 differs from the configuration in Embodiment 1.
  • The work monitoring device 200 further includes an element called a cause action recording unit 240.
  • The cause action recording unit 240 is realized by software.
  • The work monitoring program further causes the computer to function as the cause action recording unit 240.
  • The procedure of the cause action recording method will be explained based on FIG. 12.
  • The cause action recording method is part of the work monitoring method.
  • The worker corresponds to the worker in Embodiment 1 and intentionally performs the cause action in each work.
  • Steps S301 to S322 are the same as steps S101 to S122 in Embodiment 1.
  • If it is determined in step S322 that the worker's action is a cause action, the process proceeds to step S331. If it is determined in step S322 that the worker's action is not a cause action, the process proceeds to step S323.
  • In step S323, the cause action recording unit 240 calculates the positional relationship of the work elements based on their position information, generates positional relationship information 122 indicating the calculated positional relationship, and generates cause action data 120 that includes the generated positional relationship information 122.
  • The cause action recording unit 240 then adds the generated cause action data 120 to the cause action database 291.
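  • A hedged sketch of this recording flow, reusing the distance-tuple encoding assumed in the earlier sketches; the database layout and the tolerance are assumptions.

```python
def record_if_missed(work_name, observed_rel, db, tol=0.05):
    # Step S322: was the intentionally performed cause action detected?
    detected = any(
        name == work_name and all(abs(o - r) <= tol for o, r in zip(observed_rel, rel))
        for name, rel in db
    )
    # Step S323: if it was not detected, register new cause action data 120.
    if not detected:
        db.append((work_name, observed_rel))

cause_action_db = [("loosen bolt", (0.0, 0.30))]
record_if_missed("loosen bolt", (0.0, 0.10), cause_action_db)  # missed -> recorded
print(len(cause_action_db))  # 2
```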
  • Step S331 and step S332 are the same as step S131 and step S132 in Embodiment 1.
  • Note that, when it is determined in step S322 that the worker's action is a cause action, the notification unit 230 may notify the worker. Then, if no notification is issued even though the worker performed the cause action, the worker may instruct the work monitoring device 200 to generate the cause action data 120.
  • In Embodiment 3, when a worker acting as a teacher intentionally performs an action that leads to an unsafe action but the action is not detected as such, the action can be recorded as an action that leads to an unsafe action. This makes it easy to create the definition data for actions that lead to unsafe actions.
  • Embodiment 3 may be implemented in combination with Embodiment 2. That is, in Embodiment 3, guidance may be output instead of a warning.
  • The work monitoring device 200 includes a processing circuit 209.
  • The processing circuit 209 is hardware that realizes the position information acquisition unit 210, the motion determination unit 220, the notification unit 230, the cause action recording unit 240, the work instruction unit 281, and the video control unit 282.
  • The processing circuit 209 may be dedicated hardware, or it may be the processor 201 executing programs stored in the memory 202.
  • When the processing circuit 209 is dedicated hardware, the processing circuit 209 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination of these.
  • ASIC is an abbreviation for Application Specific Integrated Circuit.
  • FPGA is an abbreviation for Field Programmable Gate Array.
  • The work monitoring device 200 may include a plurality of processing circuits in place of the processing circuit 209.
  • In the processing circuit 209, some functions may be realized by dedicated hardware and the remaining functions may be realized by software or firmware.
  • In this way, the functions of the work monitoring device 200 can be realized by hardware, software, firmware, or a combination of these.
  • The word "unit" in each element of the work monitoring device 200 may be read as "step," "process," "circuit," or "circuitry."
  • 100 work training system, 110 MR glasses, 111 camera, 112 display, 113 speaker, 120 cause action data, 121 work name, 122 positional relationship information, 123 position range information, 124 operation direction information, 131 spanner, 132 bolt, 133 obstacle, 200 work monitoring device, 201 processor, 202 memory, 203 auxiliary storage device, 204 communication device, 205 input/output interface, 209 processing circuit, 210 position information acquisition unit, 220 motion determination unit, 230 notification unit, 240 cause action recording unit, 281 work instruction unit, 282 video control unit, 290 storage unit, 291 cause action database, 292 work instruction data, 293 virtual object data, 294 guidance data.

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Factory Administration (AREA)

Abstract

A position information acquisition unit (210) acquires position information of each of a tool, a work object, and an obstacle. A motion determination unit (220) refers to cause action data, which includes positional relationship information of a cause action, i.e., an action leading to an unsafe action, and determines that a worker's action is the cause action if the positional relationship based on the acquired position information matches the positional relationship indicated by the positional relationship information. A notification unit (230) notifies the worker if it is determined that the worker's action is the cause action.
PCT/JP2022/020403 2022-05-16 2022-05-16 Work monitoring device, work monitoring method, work monitoring program, work training system, and work training method WO2023223388A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/020403 WO2023223388A1 (fr) 2022-05-16 2022-05-16 Work monitoring device, work monitoring method, work monitoring program, work training system, and work training method
JP2022571865A JPWO2023223388A1 (fr) 2022-05-16 2022-05-16

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/020403 WO2023223388A1 (fr) 2022-05-16 2022-05-16 Work monitoring device, work monitoring method, work monitoring program, work training system, and work training method

Publications (1)

Publication Number Publication Date
WO2023223388A1 (fr) 2023-11-23

Family

ID=88834807

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/020403 WO2023223388A1 (fr) 2022-05-16 2022-05-16 Work monitoring device, work monitoring method, work monitoring program, work training system, and work training method

Country Status (2)

Country Link
JP (1) JPWO2023223388A1 (fr)
WO (1) WO2023223388A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016058042A (ja) * 2014-09-12 2016-04-21 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP2019070909A (ja) * 2017-10-06 2019-05-09 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP2020004218A (ja) * 2018-06-29 2020-01-09 株式会社日立システムズ Content presentation system

Also Published As

Publication number Publication date
JPWO2023223388A1 (fr) 2023-11-23

Similar Documents

Publication Publication Date Title
US20190329411A1 (en) Simulation device for robot
JP2019070909A (ja) Information processing apparatus, information processing method, and program
US8816874B2 Danger presentation device, danger presentation system, danger presentation method and program
US20150182287A1 Nonforce reflecting method for providing tool force information to a user of a telesurgical system
Solanes et al. Teleoperation of industrial robot manipulators based on augmented reality
JP6179927B2 (ja) Remote action guidance system and processing method thereof
JP6517582B2 (ja) Work safety support device, work safety support system, and work safety support method
JP6261073B2 (ja) Rescue training system
KR20170090276A (ko) Virtual reality firefighting experience system
US11897135B2 Human-cooperative robot system
WO2021176645A1 (fr) Machine fastening work method using augmented reality
WO2023223388A1 (fr) Work monitoring device, work monitoring method, work monitoring program, work training system, and work training method
US20220176560A1 Control system, control method, and control unit
CN113119982A Operation state recognition and processing method, apparatus, device, medium, and program product
JP2018005261A (ja) Feedback device, feedback method, and program
US12032354B2 Program restart assisting apparatus
JPWO2022114016A5 (ja) Control device capable of accepting direct teaching operations, teaching device, and computer program for the control device
CN112001050B Equipment debugging management and control method, apparatus, electronic device, and readable storage medium
WO2017017739A1 (fr) Operation management system, operation management method, information processing device, and operation management program
EP4261636A1 Processing device, processing system, head-mounted display, processing method, program, and storage medium
JP2016218916A (ja) Information processing apparatus, information processing method, and program
JP2938805B2 (ja) Virtual object operation method and virtual object display device
Chatzithanos et al. Fessonia: a method for real-time estimation of human operator workload using behavioural entropy
JP6816087B2 (ja) Robot alarm notification system
JP7475948B2 (ja) Training system, method and program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2022571865

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22942593

Country of ref document: EP

Kind code of ref document: A1