WO2016140129A1 - Motion Evaluation Apparatus, Motion Evaluation Method, and Computer-Readable Recording Medium - Google Patents
Motion evaluation apparatus, motion evaluation method, and computer-readable recording medium
- Publication number
- WO2016140129A1 (PCT/JP2016/055499, JP2016055499W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- motion
- visual expression
- human
- virtual space
- specific
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P13/00—Indicating or recording presence, absence, or direction, of movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/24—Use of tools
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
Definitions
- The present invention relates to a motion evaluation apparatus and a motion evaluation method for evaluating a person's motion while performing work or the like, and to a computer-readable recording medium storing a program for realizing these.
- In some cases, the above-mentioned objective can be achieved through efforts by the worker himself or herself, for example, by voluntarily asking an experienced worker or by carefully reading the manual.
- However, such measures alone are often not enough to achieve the objective.
- This is because problems that the worker cannot notice on his or her own, such as a near-miss, a mistake due to misunderstanding, or a problem that becomes apparent only after use, often become greater impediments to achieving the objective than problems the worker can notice.
- One method for solving such problems is a work simulator executed on a computer (see, for example, Non-Patent Document 1).
- The work simulator disclosed in Non-Patent Document 1 generates a production line in a virtual space from three-dimensional product data, two-dimensional factory layout data, and assembly order information.
- In addition, the work simulator disclosed in Non-Patent Document 1 calculates various indexes and displays them on the screen.
- Examples of these indexes include an index indicating whether the worker can pick up parts within arm's reach, an index indicating whether the worker can perform assembly work within that range, and an index indicating whether the necessary equipment and parts shelves can be arranged.
- According to Non-Patent Document 1, the manager can grasp the worker's movements by checking these indexes, and it is therefore thought that problems the worker cannot notice on his or her own can be solved.
- An example of an object of the present invention is to provide a motion evaluation apparatus, a motion evaluation method, and a computer-readable recording medium that solve the above problem and suppress the overlooking of problems caused by a motion, based on the actual motion.
- In order to achieve the above object, a motion evaluation apparatus according to one aspect of the present invention is characterized by comprising: a motion detection unit that detects a motion of a target human; a motion evaluation unit that determines whether the detected motion matches a preset specific motion; and a visual expression adding unit that, when the motion evaluation unit determines that the detected motion matches the specific motion, adds a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- In order to achieve the above object, a motion evaluation method according to one aspect of the present invention is characterized by comprising: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- In order to achieve the above object, a computer-readable recording medium according to one aspect of the present invention is characterized by recording a program including instructions for causing a computer to execute: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- FIG. 1 is a block diagram showing a schematic configuration of the motion evaluation apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing a specific configuration of the motion evaluation apparatus according to the embodiment of the present invention.
- FIG. 3 is a diagram showing an example of visual expression in the embodiment of the present invention.
- FIG. 4 is a flowchart showing the operation of the motion evaluation apparatus in the embodiment of the present invention.
- FIG. 5 is a block diagram illustrating an example of a computer that implements the motion evaluation apparatus according to the embodiment of the present invention.
- FIG. 1 is a block diagram showing a schematic configuration of the motion evaluation apparatus according to the embodiment of the present invention.
- As shown in FIG. 1, the motion evaluation apparatus 10 includes a motion detection unit 11, a motion evaluation unit 12, and a visual expression adding unit 13.
- The motion detection unit 11 detects the motion of a target human.
- The motion evaluation unit 12 determines whether the motion detected by the motion detection unit 11 matches a preset specific motion.
- When the motion evaluation unit 12 determines that the detected motion matches the specific motion, the visual expression adding unit 13 adds a visual expression to the part of the human, or of the object representing the human, displayed on the screen that is associated in advance with the specific motion (hereinafter referred to as the "related part").
- In this way, when the motion evaluation apparatus 10 detects, from the actual motion of the target human, a specific motion that causes a problem, it displays a visual expression indicating that specific motion. For this reason, according to the motion evaluation apparatus 10, the motion can be evaluated based on the actual motion, and the overlooking of problems caused by the motion is suppressed.
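- To make the cooperation of these three units concrete, the following is a minimal sketch in Python of the detect-evaluate-annotate flow. Every name in it (SpecificMotion, MotionEvaluationApparatus, and so on) is a hypothetical illustration; the patent does not prescribe any particular implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SpecificMotion:
    """A preset problematic motion and the body part associated with it."""
    name: str           # e.g. "head_wandering"
    related_part: str   # e.g. "head" -- the part that receives the visual expression

    def matches(self, motion: dict) -> bool:
        # Placeholder rule; a real system would compare sensor-derived
        # features of `motion` against the definition of this specific motion.
        return motion.get("label") == self.name

@dataclass
class MotionEvaluationApparatus:
    """Sketch of units 12 and 13 working together (unit 11 supplies `motion`)."""
    specific_motions: list = field(default_factory=list)

    def evaluate(self, detected_motion: dict) -> None:
        for specific in self.specific_motions:
            if specific.matches(detected_motion):
                self.add_visual_expression(specific.related_part)

    def add_visual_expression(self, part: str) -> None:
        # In the embodiment this would draw a translucent dot set (42)
        # around `part` of the worker object (41) on the screen.
        print(f"highlighting related part: {part}")

apparatus = MotionEvaluationApparatus([SpecificMotion("head_wandering", "head")])
apparatus.evaluate({"label": "head_wandering"})  # -> highlighting related part: head
```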
- FIG. 2 is a block diagram showing a specific configuration of the motion evaluation apparatus according to the embodiment of the present invention.
- FIG. 3 is a diagram showing an example of visual expression in the embodiment of the present invention.
- the motion evaluation apparatus 10 is used to evaluate the motion of the worker 40 on a production line of a factory constructed in a virtual space.
- In the present embodiment, the motion evaluation apparatus 10 is connected to VR (Virtual Reality) goggles 20 worn by the worker 40 and the manager 50, and to a camera 30 for detecting the positions of the worker 40 and the manager 50.
- the VR goggles 20 are equipped with a motion sensor (not shown in FIG. 2) for detecting the movement of the VR goggles 20 and an infrared sensor 21 for detecting the position and movement of the wearer's hand.
- The motion sensor includes an acceleration sensor and an angular velocity sensor, and outputs signals for specifying the movement of the VR goggles 20 themselves.
- the infrared sensor 21 includes a light source that outputs infrared light and an image sensor that receives the infrared light reflected by the wearer's hand, and outputs an infrared image as a signal.
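- As one way to picture how a hand position might be recovered from such an infrared image, here is a minimal sketch using a threshold-and-centroid rule. This processing step is an assumption made for illustration only; the patent does not specify how the infrared image is interpreted.

```python
import numpy as np

def hand_position(ir_image: np.ndarray, threshold: float = 0.5):
    """Estimate a hand position as the centroid of bright (reflective) pixels.

    ir_image: 2D array of infrared intensities in [0, 1].
    Returns (row, col) of the centroid, or None if nothing is bright enough.
    """
    mask = ir_image > threshold      # pixels likely reflecting IR off the hand
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())  # centroid of the bright blob

# Toy example: a bright 2x2 patch in an otherwise dark frame.
frame = np.zeros((8, 8))
frame[1:3, 2:4] = 0.9
print(hand_position(frame))  # -> (1.5, 2.5)
```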
- In the present embodiment, the VR goggles 20 and the sensors used with them are not particularly limited; various VR goggles and sensors, including those developed in the future, can be used.
- As shown in FIG. 2, the motion evaluation apparatus 10 includes, in addition to the motion detection unit 11, the motion evaluation unit 12, and the visual expression adding unit 13, a virtual space construction unit 14 that constructs a virtual space.
- the virtual space construction unit 14 creates an object representing a person in the virtual space, and moves the object in accordance with the motion detected by the motion detection unit 11.
- Specifically, in the constructed virtual space, the virtual space construction unit 14 creates an object 41 representing the worker 40 (hereinafter referred to as the "worker object") and an object 43 representing the equipment of the production line (hereinafter referred to as the "equipment object").
- The virtual space construction unit 14 also creates an object representing the manager 50 (hereinafter referred to as the "manager object") in the virtual space.
- The motion detection unit 11 detects the motions of the worker 40 and the manager 50 based on the signal from the motion sensor mounted on the VR goggles 20 and the signal from the infrared sensor 21. Further, the motion detection unit 11 detects the positions of the worker 40 and the manager 50 based on the image data from the camera 30.
- The virtual space construction unit 14 identifies positions in the virtual space from the positions of the worker 40 and the manager 50 detected by the motion detection unit 11. At the identified positions, the virtual space construction unit 14 then moves the worker object 41 and the manager object in accordance with the motions detected by the motion detection unit 11.
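- A minimal sketch of this position mapping might look as follows, assuming a simple linear calibration between camera coordinates and virtual-space coordinates. The scale and offset values, and the Avatar class, are hypothetical illustrations.

```python
import numpy as np

# Hypothetical calibration: virtual = scale * camera + offset (per axis).
SCALE = np.array([0.01, 0.01, 0.01])   # camera units -> metres in virtual space
OFFSET = np.array([0.0, 0.0, 1.0])

def to_virtual(camera_pos: np.ndarray) -> np.ndarray:
    """Map a position detected by the camera 30 into the virtual space."""
    return SCALE * camera_pos + OFFSET

class Avatar:
    """Stands in for the worker object 41 or the manager object."""
    def __init__(self, name: str):
        self.name = name
        self.position = np.zeros(3)
        self.pose = {}

    def update(self, camera_pos: np.ndarray, detected_pose: dict) -> None:
        # The virtual space construction unit 14 places the object at the
        # identified position and replays the detected motion on it.
        self.position = to_virtual(camera_pos)
        self.pose = detected_pose

worker = Avatar("worker_object_41")
worker.update(np.array([120.0, 40.0, 300.0]), {"head_yaw": 0.2})
print(worker.position)  # -> [1.2 0.4 4. ]
```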
- By wearing the VR goggles 20, the worker 40 and the manager 50 can move around freely and perform actions in the virtual space constructed by the motion evaluation apparatus 10, as if they were actually inside it. The manager 50 can then observe how the worker object 41 handles the equipment object 43.
- In the present embodiment, as shown in FIG. 3, the visual expression adding unit 13 can add, for example, a dot set 42 as the visual expression to the related part of the worker object 41.
- The dot set 42 appears to the manager 50 as a translucent haze surrounding the related part of the worker object 41, so that the manager 50 can confirm both the motion of the worker 40 and the problematic part at a glance.
- The dot set 42 is preferably given a conspicuous color, for example red or blue. The color of the dot set 42 may also differ depending on the part of the worker object 41. Furthermore, although the dots in the example of FIG. 3 are circular, they are not limited to this shape and may, for example, be spherical. One way such a dot haze could be generated is sketched below.
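- As an illustration, the following sketch scatters semi-transparent dots on a sphere around the centre of a body part. The RGBA colour representation and the spherical sampling are assumptions made for illustration; the patent does not prescribe how the dot set is rendered.

```python
import numpy as np

def dot_set(center: np.ndarray, radius: float, n_dots: int = 200,
            rgba=(1.0, 0.0, 0.0, 0.4)):
    """Return (positions, color) for a translucent haze of dots.

    Dots are sampled uniformly on a sphere of `radius` around `center`, so
    from any viewpoint they read as a haze surrounding the part. The alpha
    of 0.4 keeps the worker object visible through the dots.
    """
    rng = np.random.default_rng(0)
    v = rng.normal(size=(n_dots, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # unit direction vectors
    return center + radius * v, rgba

positions, color = dot_set(np.array([0.0, 1.6, 0.0]), radius=0.25)
print(positions.shape, color)  # -> (200, 3) (1.0, 0.0, 0.0, 0.4)
```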
- The motion evaluation unit 12 then determines whether the motion detected by the motion detection unit 11 matches the specific motion.
- Examples of the specific motion include a motion in which the worker 40 moves his or her head up and down, left and right, or back and forth during work. Specifically, this occurs when the worker 40 does not know the next task and looks around to find the place to work on or to search for a specific part. During normal work it is rare to move the head up and down, left and right, and back and forth without fixing the line of sight on a specific point, so when the worker 40 behaves in this way, the worker 40 may be "lost", which may indicate a problem.
- In this case, the "head" is associated with this specific motion as the related part.
- When the worker 40 moves his or her head in this way, the motion evaluation unit 12 determines that the motion of the worker 40 matches the specific motion.
- The visual expression adding unit 13 then adds the dot set 42 as the visual expression around the portion corresponding to the head of the worker object 41.
- The visual expression adding unit 13 can also change the color intensity of the dot set 42 in accordance with the degree of the worker 40's head movement.
- Thereby, the manager 50 can judge from the color density how much the worker 40 is at a loss. A sketch of one possible criterion follows.
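- One plausible way to quantify this head-wandering motion, and to derive a colour density from its degree, is sketched below. The variance-based measure and the threshold values are assumptions chosen for illustration; the patent leaves the concrete criterion open.

```python
import numpy as np

def wander_degree(yaw_pitch_history: np.ndarray) -> float:
    """Degree of 'being lost', from recent head-orientation samples.

    yaw_pitch_history: shape (n_samples, 2), columns are (yaw, pitch) in
    radians. A fixed gaze gives near-zero variance; looking around up/down,
    left/right, back/forth gives a large one.
    """
    return float(yaw_pitch_history.var(axis=0).sum())

def dot_alpha(degree: float, threshold: float = 0.02, full: float = 0.2) -> float:
    """Map the degree to the dot set's opacity: 0 below `threshold`,
    rising linearly to 1.0 at `full` and clipped there."""
    if degree < threshold:
        return 0.0
    return min(1.0, (degree - threshold) / (full - threshold))

steady = np.full((50, 2), 0.1)                           # gaze fixed on one point
lost = np.random.default_rng(0).normal(0, 0.3, (50, 2))  # looking all around
print(dot_alpha(wander_degree(steady)))  # -> 0.0
print(dot_alpha(wander_degree(lost)))    # -> close to 1.0
```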
- Another example of the specific motion is a motion in which the worker 40 changes his or her posture while performing the same task. The "waist" is associated with this specific motion as the related part. Therefore, when the motion evaluation unit 12 determines that the motion of the worker 40 matches this specific motion, the visual expression adding unit 13 adds the dot set 42 as the visual expression around the portion corresponding to the waist of the worker object 41, as shown in FIG. 3.
- Note, however, that changing posture may in some cases improve work efficiency. Therefore, when a motion of changing posture during the same task is set as a specific motion, it is preferable to also define the direction of the motion, as in the sketch below.
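- A preset table of specific motions could then record the direction alongside the related part. The following sketch shows one possible shape for such preset data; the field names and values are hypothetical, and the forward/backward distinction is only an example of a direction rule.

```python
from typing import Optional

# Hypothetical preset table. Each entry names the specific motion, the
# related part to highlight, and -- where it matters, as for posture
# changes -- the direction that marks the motion as problematic.
SPECIFIC_MOTIONS = [
    {"name": "head_wandering", "related_part": "head", "direction": None},
    # Bending forward at the waist during the same task is flagged, while a
    # posture change in another direction (which may improve efficiency) is not.
    {"name": "posture_change", "related_part": "waist", "direction": "forward"},
]

def is_flagged(name: str, direction: Optional[str]) -> bool:
    """True if the observed motion matches a preset specific motion,
    taking the preset's direction (if any) into account."""
    for m in SPECIFIC_MOTIONS:
        if m["name"] == name and m["direction"] in (None, direction):
            return True
    return False

print(is_flagged("posture_change", "forward"))   # -> True
print(is_flagged("posture_change", "backward"))  # -> False
```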
- FIG. 4 is a flowchart showing the operation of the motion evaluation apparatus in the embodiment of the present invention.
- In the following description, FIGS. 1 to 3 are referred to as appropriate.
- In the present embodiment, the motion evaluation method is carried out by operating the motion evaluation apparatus. The description of the motion evaluation method in the present embodiment is therefore given through the following description of the operation of the motion evaluation apparatus 10.
- First, the virtual space construction unit 14 constructs a virtual space (step A1).
- Next, the virtual space construction unit 14 creates the worker object 41, the equipment object 43, and the manager object in the constructed virtual space (step A2).
- When steps A1 and A2 have been executed, the worker 40 wearing the VR goggles 20 can work in the virtual space, and the manager 50 can observe the motion of the worker 40. The motion detection unit 11 then detects the motions of the worker 40 and the manager 50 based on the signal from the motion sensor mounted on the VR goggles 20 and the signal from the infrared sensor 21 (step A3).
- Next, the motion evaluation unit 12 determines whether the motion detected in step A3 matches the specific motion (step A4). If, as a result of the determination in step A4, the detected motion does not match the specific motion, step A3 is executed again. If, on the other hand, the detected motion matches the specific motion, step A5 is executed.
- In step A5, the visual expression adding unit 13 adds the visual expression to the related part of the worker object 41. For example, the visual expression adding unit 13 adds the dot set 42 to the portion corresponding to the "waist" of the worker object 41 (see FIG. 3).
- Thereafter, the motion detection unit 11 determines whether an instruction to end the processing has been given (step A6). If the end of the processing has not been instructed, the motion detection unit 11 executes step A3 again. If the end of the processing has been instructed, the processing in the motion evaluation apparatus 10 ends. The whole loop is sketched in code below.
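- Put together, steps A1 to A6 form the loop sketched below. This is a schematic rendering of the flowchart of FIG. 4, with hypothetical function names standing in for the units described above; it is not the claimed implementation.

```python
class VirtualSpace:
    """Stands in for what the virtual space construction unit 14 builds."""
    def create_objects(self, names):
        self.objects = set(names)
    def add_visual_expression(self, part):
        print(f"dots around: {part}")

def run_motion_evaluation(detect, match, stop_requested):
    """Schematic main loop for steps A1-A6 of FIG. 4."""
    space = VirtualSpace()                                    # step A1
    space.create_objects(["worker", "equipment", "manager"])  # step A2
    while True:
        motion = detect()                                     # step A3
        related_part = match(motion)                          # step A4
        if related_part is not None:                          # matched -> step A5
            space.add_visual_expression(related_part)
        if stop_requested():                                  # step A6
            return

# Toy run: two iterations; the second motion matches the "waist" rule.
state = {"steps": 0}
def detect():
    state["steps"] += 1
    return {"label": "ok" if state["steps"] == 1 else "posture_change"}

run_motion_evaluation(
    detect=detect,
    match=lambda m: "waist" if m["label"] == "posture_change" else None,
    stop_requested=lambda: state["steps"] >= 2,
)  # prints: dots around: waist
```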
- As described above, in the present embodiment, the manager 50 can confirm the specific motions of the worker 40 without omission, so that the overlooking of problems caused by specific motions is suppressed. That is, in the present embodiment, even a manager 50 who is not an expert in production line design can notice problems in the same way an expert would. According to the present embodiment, it is possible to construct a production line in which the worker 40 can work efficiently and in which the physical burden on the worker is reduced. As a result, the occurrence of defective products can also be suppressed.
- In the example of FIG. 2, the manager 50, like the worker 40, wears the VR goggles 20 and observes the virtual space, but the present embodiment is not limited to the example of FIG. 2.
- For example, if an ordinary display device is connected to the motion evaluation apparatus 10, the manager 50 may observe the virtual space via the screen of that display device. Even in this aspect, the manager 50 can confirm the visual expression added to the worker object 41.
- In the above description, the dot set 42 is given as an example of the visual expression, but the visual expression is not particularly limited in the present embodiment. For example, an aspect in which the color of the related part of the worker object 41 changes may also be used.
- In the present embodiment, the motion of the worker 40 is detected by the motion sensor and the infrared sensor 21 mounted on the VR goggles 20 and by the position detection camera 30. However, the method of detecting the motion is not particularly limited in the present embodiment; the motion detection may be performed using, for example, a motion capture technique.
- Although the present embodiment has been described using the example in which the motion evaluation apparatus 10 evaluates the motion of the worker 40 on a production line, the application of the motion evaluation apparatus 10 is not limited to this. For example, the motion evaluation apparatus 10 can be used to evaluate the motions of doctors and nurses in an operating room. In this case, it becomes possible to evaluate whether a nurse can hand the appropriate surgical tool to the doctor without difficulty.
- The motion evaluation apparatus 10 can also be used to evaluate the motions of a service person performing maintenance on various machines, such as a bank automatic teller machine (ATM), a multifunction machine, or a machine tool.
- The program in the embodiment of the present invention may be any program that causes a computer to execute steps A1 to A6 shown in FIG. 4.
- By installing this program on a computer and executing it, the motion evaluation apparatus 10 and the motion evaluation method in the present embodiment can be realized. In this case, the CPU (Central Processing Unit) of the computer functions as the motion detection unit 11, the motion evaluation unit 12, the visual expression adding unit 13, and the virtual space construction unit 14, and performs the processing.
- FIG. 5 is a block diagram illustrating an example of a computer that implements the motion evaluation apparatus according to the embodiment of the present invention.
- the computer 110 includes a CPU 111, a main memory 112, a storage device 113, an input interface 114, a display controller 115, a data reader / writer 116, and a communication interface 117. These units are connected to each other via a bus 121 so that data communication is possible.
- The CPU 111 carries out various operations by loading the program (code) of the present embodiment, stored in the storage device 113, into the main memory 112 and executing it in a predetermined order.
- the main memory 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory).
- The program of the present embodiment is provided stored on a computer-readable recording medium 120. Note that the program of the present embodiment may instead be distributed over the Internet, connected via the communication interface 117.
- Specific examples of the storage device 113 include a hard disk drive and a semiconductor storage device such as a flash memory.
- the input interface 114 mediates data transmission between the CPU 111 and an input device 118 such as a keyboard and a mouse.
- the display controller 115 is connected to the display device 119 and controls display on the display device 119.
- the data reader / writer 116 mediates data transmission between the CPU 111 and the recording medium 120, and reads a program from the recording medium 120 and writes a processing result in the computer 110 to the recording medium 120.
- the communication interface 117 mediates data transmission between the CPU 111 and another computer.
- Specific examples of the recording medium 120 include general-purpose semiconductor storage devices such as CF (Compact Flash (registered trademark)) and SD (Secure Digital), magnetic storage media such as a flexible disk, and optical storage media such as a CD-ROM (Compact Disk Read Only Memory).
- (Supplementary Note 1) A motion evaluation apparatus comprising: a motion detection unit that detects a motion of a target human; a motion evaluation unit that determines whether the detected motion matches a preset specific motion; and a visual expression adding unit that, when the motion evaluation unit determines that the detected motion matches the specific motion, adds a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- (Supplementary Note 2) The motion evaluation apparatus according to Supplementary Note 1, further comprising a virtual space construction unit that constructs a virtual space, wherein the virtual space construction unit creates the object representing the human in the virtual space and moves the object in accordance with the motion detected by the motion detection unit, and the visual expression adding unit adds the visual expression to the part of the object representing the human displayed on the screen that is associated in advance with the specific motion.
- (Supplementary Note 3) The motion evaluation apparatus according to Supplementary Note 1, wherein the visual expression adding unit adds a set of dots as the visual expression.
- (Supplementary Note 4) A motion evaluation method comprising: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- (Supplementary Note 7) A computer-readable recording medium recording a program including instructions for causing a computer to execute: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- According to the present invention, it is possible to evaluate a motion based on the actual motion and to suppress the overlooking of problems caused by the motion.
- The present invention is useful for motion evaluation on a factory production line, motion evaluation in the maintenance of various machines, motion evaluation in surgery, and the like.
Description
In order to achieve the above object, a motion evaluation apparatus according to one aspect of the present invention is characterized by comprising: a motion detection unit that detects a motion of a target human; a motion evaluation unit that determines whether the detected motion matches a preset specific motion; and a visual expression adding unit that, when the motion evaluation unit determines that the detected motion matches the specific motion, adds a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
In order to achieve the above object, a motion evaluation method according to one aspect of the present invention is characterized by comprising: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
In order to achieve the above object, a computer-readable recording medium according to one aspect of the present invention is characterized by recording a program including instructions for causing a computer to execute: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
Hereinafter, the motion evaluation apparatus, motion evaluation method, and program according to the embodiment of the present invention will be described with reference to FIGS. 1 to 5.
First, the configuration of the motion evaluation apparatus according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the schematic configuration of the motion evaluation apparatus according to the embodiment of the present invention. As shown in FIG. 1, the motion evaluation apparatus 10 includes a motion detection unit 11, a motion evaluation unit 12, and a visual expression adding unit 13.
Next, the operation of the motion evaluation apparatus according to the present embodiment will be described with reference to FIG. 4. FIG. 4 is a flowchart showing the operation of the motion evaluation apparatus according to the embodiment of the present invention. In the following description, FIGS. 1 to 3 are referred to as appropriate. In the present embodiment, the motion evaluation method is carried out by operating the motion evaluation apparatus; the description of the motion evaluation method is therefore given through the following description of the operation of the motion evaluation apparatus 10.
As described above, in the present embodiment, the manager 50 can confirm the specific motions of the worker 40 without omission, so that the overlooking of problems caused by specific motions is suppressed. That is, in the present embodiment, even a manager 50 who is not an expert in production line design can notice problems in the same way an expert would. According to the present embodiment, a production line can be constructed in which the worker 40 can work efficiently and the physical burden on the worker is reduced. As a result, the occurrence of defective products can also be suppressed.
In the example of FIG. 2, the manager 50, like the worker 40, wears the VR goggles 20 and observes the virtual space, but the present embodiment is not limited to the example of FIG. 2. For example, if an ordinary display device is connected to the motion evaluation apparatus 10, the manager 50 may observe the virtual space via the screen of that display device. Even in this aspect, the manager 50 can confirm the visual expression added to the worker object 41.
Further, although the present embodiment describes the example in which the motion evaluation apparatus 10 is used to evaluate the motion of the worker 40 on a production line, the application of the motion evaluation apparatus 10 is not limited to this. For example, the motion evaluation apparatus 10 can also be used to evaluate the motions of doctors and nurses in an operating room. In this case, it becomes possible to evaluate whether a nurse can hand the appropriate surgical tool to the doctor without difficulty.
The program in the embodiment of the present invention may be any program that causes a computer to execute steps A1 to A6 shown in FIG. 4. By installing this program on a computer and executing it, the motion evaluation apparatus 10 and the motion evaluation method of the present embodiment can be realized. In this case, the CPU (Central Processing Unit) of the computer functions as the motion detection unit 11, the motion evaluation unit 12, the visual expression adding unit 13, and the virtual space construction unit 14, and performs the processing.
(Supplementary Note 1) A motion evaluation apparatus comprising: a motion detection unit that detects a motion of a target human; a motion evaluation unit that determines whether the detected motion matches a preset specific motion; and a visual expression adding unit that, when the motion evaluation unit determines that the detected motion matches the specific motion, adds a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
(Supplementary Note 2) The motion evaluation apparatus according to Supplementary Note 1, further comprising a virtual space construction unit that constructs a virtual space, wherein the virtual space construction unit creates the object representing the human in the virtual space and moves the object in accordance with the motion detected by the motion detection unit, and the visual expression adding unit adds the visual expression to the part of the object representing the human displayed on the screen that is associated in advance with the specific motion.
(Supplementary Note 3) The motion evaluation apparatus according to Supplementary Note 1, wherein the visual expression adding unit adds a set of dots as the visual expression.
(Supplementary Note 4) A motion evaluation method comprising: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
(Supplementary Note 5) The motion evaluation method according to Supplementary Note 4, further comprising: (d) a step of constructing a virtual space; and (e) a step of creating the object representing the human in the virtual space and moving the object in accordance with the motion detected in step (a), wherein in step (c) the visual expression is added to the part of the object representing the human displayed on the screen that is associated in advance with the specific motion.
(Supplementary Note 6) The motion evaluation method according to Supplementary Note 4, wherein in step (c) a set of dots is added as the visual expression.
(Supplementary Note 7) A computer-readable recording medium recording a program including instructions for causing a computer to execute: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
(Supplementary Note 8) The computer-readable recording medium according to Supplementary Note 7, wherein the program further includes instructions for causing the computer to execute: (d) a step of constructing a virtual space; and (e) a step of creating the object representing the human in the virtual space and moving the object in accordance with the motion detected in step (a), and wherein in step (c) the visual expression is added to the part of the object representing the human displayed on the screen that is associated in advance with the specific motion.
(Supplementary Note 9) The computer-readable recording medium according to Supplementary Note 7, wherein in step (c) a set of dots is added as the visual expression.
11 Motion detection unit
12 Motion evaluation unit
13 Visual expression adding unit
14 Virtual space construction unit
20 VR goggles
21 Infrared sensor
30 Camera for position detection
40 Worker
41 Worker object
42 Dot set
43 Equipment object
50 Manager
110 Computer
111 CPU
112 Main memory
113 Storage device
114 Input interface
115 Display controller
116 Data reader/writer
117 Communication interface
118 Input device
119 Display device
120 Recording medium
121 Bus
Claims (9)
- 1. A motion evaluation apparatus comprising: a motion detection unit that detects a motion of a target human; a motion evaluation unit that determines whether the detected motion matches a preset specific motion; and a visual expression adding unit that, when the motion evaluation unit determines that the detected motion matches the specific motion, adds a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- 2. The motion evaluation apparatus according to claim 1, further comprising a virtual space construction unit that constructs a virtual space, wherein the virtual space construction unit creates the object representing the human in the virtual space and moves the object in accordance with the motion detected by the motion detection unit, and the visual expression adding unit adds the visual expression to the part of the object representing the human displayed on the screen that is associated in advance with the specific motion.
- 3. The motion evaluation apparatus according to claim 1 or 2, wherein the visual expression adding unit adds a set of dots as the visual expression.
- 4. A motion evaluation method comprising: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- 5. The motion evaluation method according to claim 4, further comprising: (d) a step of constructing a virtual space; and (e) a step of creating the object representing the human in the virtual space and moving the object in accordance with the motion detected in step (a), wherein in step (c) the visual expression is added to the part of the object representing the human displayed on the screen that is associated in advance with the specific motion.
- 6. The motion evaluation method according to claim 4 or 5, wherein in step (c) a set of dots is added as the visual expression.
- 7. A computer-readable recording medium recording a program including instructions for causing a computer to execute: (a) a step of detecting a motion of a target human; (b) a step of determining whether the motion detected in step (a) matches a preset specific motion; and (c) a step of, when it is determined in step (b) that the motion matches the specific motion, adding a visual expression to a part of the human, or of an object representing the human, displayed on a screen, the part being associated in advance with the specific motion.
- 8. The computer-readable recording medium according to claim 7, wherein the program further includes instructions for causing the computer to execute: (d) a step of constructing a virtual space; and (e) a step of creating the object representing the human in the virtual space and moving the object in accordance with the motion detected in step (a), and wherein in step (c) the visual expression is added to the part of the object representing the human displayed on the screen that is associated in advance with the specific motion.
- 9. The computer-readable recording medium according to claim 7 or 8, wherein in step (c) a set of dots is added as the visual expression.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/553,838 US10593223B2 (en) | 2015-03-05 | 2016-02-24 | Action evaluation apparatus, action evaluation method, and computer-readable storage medium |
JP2017503437A JP6462847B2 (ja) | 2015-03-05 | 2016-02-24 | Motion evaluation apparatus, motion evaluation method, and program |
CN201680013144.8A CN107408354B (zh) | 2015-03-05 | 2016-02-24 | 动作评估装置、动作评估方法和计算机可读存储介质 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-043583 | 2015-03-05 | ||
JP2015043583 | 2015-03-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016140129A1 true WO2016140129A1 (ja) | 2016-09-09 |
Family
ID=56848894
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/055499 WO2016140129A1 (ja) | 2015-03-05 | 2016-02-24 | Motion evaluation apparatus, motion evaluation method, and computer-readable recording medium |
Country Status (4)
Country | Link |
---|---|
US (1) | US10593223B2 (ja) |
JP (1) | JP6462847B2 (ja) |
CN (1) | CN107408354B (ja) |
WO (1) | WO2016140129A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10839717B2 (en) * | 2016-01-11 | 2020-11-17 | Illinois Tool Works Inc. | Weld training systems to synchronize weld data for presentation |
JP2017126935A (ja) * | 2016-01-15 | 2017-07-20 | Sony Corporation | Information processing apparatus, information processing system, information processing method, and program |
CN111583733A (zh) * | 2020-05-12 | 2020-08-25 | Guangdong Genius Technology Co., Ltd. | Intelligent tutoring machine and interaction method thereof |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6059576A (en) * | 1997-11-21 | 2000-05-09 | Brann; Theodore L. | Training and safety device, system and method to aid in proper movement during physical activity |
US9182883B2 (en) * | 2009-01-15 | 2015-11-10 | Social Communications Company | Communicating between a virtual area and a physical space |
US9146669B2 (en) | 2009-12-29 | 2015-09-29 | Bizmodeline Co., Ltd. | Password processing method and apparatus |
US9098873B2 (en) | 2010-04-01 | 2015-08-04 | Microsoft Technology Licensing, Llc | Motion-based interactive shopping environment |
CN103154982A (zh) * | 2010-08-16 | 2013-06-12 | Social Communications Company | Facilitating communicant interactions in a network communications environment |
US8832233B1 (en) * | 2011-07-20 | 2014-09-09 | Google Inc. | Experience sharing for conveying communication status |
AU2011205223C1 (en) * | 2011-08-09 | 2013-03-28 | Microsoft Technology Licensing, Llc | Physical interaction with virtual objects for DRM |
US10824310B2 (en) * | 2012-12-20 | 2020-11-03 | Sri International | Augmented reality virtual personal assistant for external representation |
US9652992B2 (en) * | 2012-10-09 | 2017-05-16 | Kc Holdings I | Personalized avatar responsive to user physical state and context |
US9198622B2 (en) * | 2012-10-09 | 2015-12-01 | Kc Holdings I | Virtual avatar using biometric feedback |
US9199122B2 (en) * | 2012-10-09 | 2015-12-01 | Kc Holdings I | Personalized avatar responsive to user physical state and context |
US9501942B2 (en) * | 2012-10-09 | 2016-11-22 | Kc Holdings I | Personalized avatar responsive to user physical state and context |
US9685001B2 (en) * | 2013-03-15 | 2017-06-20 | Blackberry Limited | System and method for indicating a presence of supplemental information in augmented reality |
WO2015108700A1 (en) * | 2014-01-14 | 2015-07-23 | Zsolutionz, LLC | Sensor-based evaluation and feedback of exercise performance |
2016
- 2016-02-24 JP JP2017503437A patent/JP6462847B2/ja active Active
- 2016-02-24 US US15/553,838 patent/US10593223B2/en active Active
- 2016-02-24 CN CN201680013144.8A patent/CN107408354B/zh active Active
- 2016-02-24 WO PCT/JP2016/055499 patent/WO2016140129A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003330971A (ja) * | 2002-05-15 | 2003-11-21 | Toshiba Corp | Standard work document creation method and storage medium |
JP2005134536A (ja) * | 2003-10-29 | 2005-05-26 | Omron Corp | Work training support system |
JP2006171184A (ja) * | 2004-12-14 | 2006-06-29 | Toshiba Corp | Skill evaluation system and skill evaluation method |
JP2006302122A (ja) * | 2005-04-22 | 2006-11-02 | Nippon Telegr & Teleph Corp <Ntt> | Exercise support system, user terminal device therefor, and exercise support program |
JP2013088730A (ja) * | 2011-10-21 | 2013-05-13 | Toyota Motor East Japan Inc | Skill acquisition support system and skill acquisition support method |
Non-Patent Citations (2)
Title |
---|
KAZUHIRO MIYASA ET AL.: "A Recording and Visualization Method of Working Activities in a Mixed Reality Space", TRANSACTIONS OF INFORMATION PROCESSING SOCIETY OF JAPAN, vol. 47, no. 1, 15 January 2006 (2006-01-15), pages 181 - 192 * |
NORIHIRO TAKAHASHI ET AL.: "Posture Estimation from Bird's-eye View of Range Image Sensor", FIT2007 DAI 6 KAI FORUM ON INFORMATION TECHNOLOGY IPPAN KOEN RONBUNSHU SEPARATE, vol. 3, 22 August 2007 (2007-08-22), pages 79 - 80 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019197165A (ja) * | 2018-05-10 | 2019-11-14 | NEC Corporation | Work training apparatus, work training method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP6462847B2 (ja) | 2019-01-30 |
JPWO2016140129A1 (ja) | 2018-01-25 |
US20180053438A1 (en) | 2018-02-22 |
CN107408354A (zh) | 2017-11-28 |
US10593223B2 (en) | 2020-03-17 |
CN107408354B (zh) | 2019-12-20 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16758823; Country of ref document: EP; Kind code of ref document: A1
 | WWE | Wipo information: entry into national phase | Ref document number: 15553838; Country of ref document: US
 | ENP | Entry into the national phase | Ref document number: 2017503437; Country of ref document: JP; Kind code of ref document: A
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 16758823; Country of ref document: EP; Kind code of ref document: A1