WO2019207732A1 - Operation assistance device - Google Patents

Operation assistance device

Info

Publication number
WO2019207732A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
sensor signal
unit
sensor
feature amount
Application number
PCT/JP2018/017060
Other languages
French (fr)
Japanese (ja)
Inventor
智哉 藤田
佐藤 剛
堀 淳志
森 健太郎
松原 厚
大輔 河野
孝 楠見
Original Assignee
三菱電機株式会社
国立大学法人京都大学
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation) and 国立大学法人京都大学 (Kyoto University)
Priority to JP2020515406A priority Critical patent/JP7128267B2/en
Priority to PCT/JP2018/017060 priority patent/WO2019207732A1/en
Publication of WO2019207732A1 publication Critical patent/WO2019207732A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/04Manufacturing
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Definitions

  • the present invention relates to a work support apparatus that supports work proficiency of unskilled workers.
  • Training for work proficiency takes various forms depending on the target work and the worksite: memorizing procedures or instructions, repeating the same procedure on a practice product, or receiving on-site guidance from a skilled worker or supervisor.
  • Patent Literature 1 proposes an operation training apparatus that can evaluate an improvement in the level of proficiency of operation training.
  • A skilled worker who decides the machining conditions to be set on an NC (Numerical Control) machine tool can sense slight differences in the machining state of the workpiece with the body's sensory organs, as differences in sound or in the appearance of chips, and can thereby determine machining conditions that realize the required machining accuracy.
  • Unskilled workers, whose bodily sensory organs are not sufficiently trained, cannot adequately sense these slight differences in the machining state of the workpiece. For this reason, it is difficult for an unskilled worker to perform work equivalent to that performed by a skilled worker.
  • The present invention has been made in view of the above, and its object is to obtain a work support device that enables even an unskilled worker to perform work equivalent to that performed by a skilled worker.
  • To solve the above problem, the work support device includes an acquisition unit that acquires a sensor signal from a sensor measuring a state quantity of the work environment or the work object, a conversion unit that converts the sensor signal into a feature amount that a human sensory organ can sense, and an output unit that outputs the feature amount so that the worker's sensory organs can sense it.
  • The work support apparatus according to the present invention has the effect that even an unskilled worker can perform work equivalent to that performed by a skilled worker.
  • FIG. 1 shows the configuration of the work support apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 shows an example in which the work support apparatus according to Embodiment 1 is applied to machining-condition determination work on an NC machine tool.
  • FIG. 3 is a perspective view showing the schematic configuration of the NC machine tool shown in FIG. 2.
  • FIG. 4 shows an example of an image displaying the feature amount of vector information extracted from the sensor signal of the force sensor shown in FIG. 2.
  • FIG. 5 shows the configuration of the work support apparatus according to Embodiment 2 of the present invention.
  • FIG. 6 shows the configuration of the sensor information conversion unit shown in FIG. 5.
  • FIG. 7 shows an example in which the work support apparatus according to Embodiment 2 is applied to parameter adjustment work in pick-and-place by a robot.
  • FIG. 8 is a perspective view showing the schematic configuration of the robot shown in FIG. 7.
  • FIG. 9 shows an example of the arrangement of the sensation generating device group and the worker shown in FIG. 7.
  • FIG. 10 shows the configuration of the work support apparatus according to Embodiment 3 of the present invention.
  • FIG. 11 shows the configuration of the work support apparatus according to Embodiment 4 of the present invention.
  • FIG. 12 shows an example of the input screen for conditions input to the training evaluation input unit shown in FIG. 11.
  • FIG. 13 shows an example of a screen displaying, on the display device, the conditions and training evaluations stored in the information storage unit shown in FIG. 11.
  • FIG. 14 shows the configuration of the work support apparatus according to Embodiment 5 of the present invention.
  • FIG. 15 is a flowchart showing an example of a work training procedure using the work support apparatus shown in FIG. 14.
  • FIG. 1 is a diagram for explaining the configuration of the work support apparatus according to the first embodiment of the present invention.
  • the work support device 100a shown in FIG. 1 supports work proficiency of an unskilled worker 1 (hereinafter simply referred to as “worker 1”).
  • the work support apparatus 100a includes a sensor signal acquisition unit 2, a sensor information conversion unit 3, and a sensory information output unit 4.
  • the sensor signal acquisition unit 2 corresponds to an acquisition unit.
  • the sensor information conversion unit 3 corresponds to a conversion unit.
  • the sensory information output unit 4 corresponds to an output unit.
  • the sensor signal acquisition unit 2 acquires a sensor signal SS that is a measurement result of the sensor group 101.
  • the sensor group 101 includes at least one sensor. Some sensors constituting the sensor group 101 are provided in the work environment 102 and measure the physical quantity PQ1 of the work environment 102. The physical quantity PQ1 corresponds to the state quantity. The other sensors constituting the sensor group 101 are provided on the work target 103 and measure the physical quantity PQ2 of the work target 103. The physical quantity PQ2 corresponds to the state quantity.
  • the sensors constituting the sensor group 101 may be provided only in one of the work environment 102 and the work object 103.
  • the sensor information conversion unit 3 converts the sensor signal SS acquired by the sensor signal acquisition unit 2 into a feature amount FV that can be sensed by a human sensory organ by calculation or the like.
  • the sensory information output unit 4 outputs sensory information SI including the feature amount FV converted by the sensor information converter 3 to the sensory generating device group 104.
  • the sensation generating device group 104 includes at least one sensation generating device. The sensation generating device group 104 transmits the feature value FV to the sensory organ of the worker 1.
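The acquisition → conversion → output pipeline described above can be sketched as follows. This is an illustrative sketch only: all class and function names are assumptions, not identifiers from the patent, and the "sensors" and "sensation generating device" are stand-in callables.

```python
from typing import Callable, List

class SensorSignalAcquisitionUnit:
    """Acquires the sensor signal SS from the sensor group (illustrative)."""
    def __init__(self, sensors: List[Callable[[], float]]):
        self.sensors = sensors

    def acquire(self) -> List[float]:
        # One reading per sensor in the sensor group 101
        return [read() for read in self.sensors]

class SensorInformationConversionUnit:
    """Converts a raw sensor signal into a human-sensible feature amount FV."""
    def __init__(self, convert: Callable[[List[float]], float]):
        self.convert = convert

    def to_feature(self, signal: List[float]) -> float:
        return self.convert(signal)

class SensoryInformationOutputUnit:
    """Sends the feature amount to a sensation generating device."""
    def __init__(self, device: Callable[[float], None]):
        self.device = device

    def output(self, feature: float) -> None:
        self.device(feature)

# Wire the three units together: two dummy sensors, the mean as the
# feature amount, and a print-based "sensation generating device".
acq = SensorSignalAcquisitionUnit([lambda: 0.2, lambda: 0.4])
conv = SensorInformationConversionUnit(lambda s: sum(s) / len(s))
out = SensoryInformationOutputUnit(lambda fv: print(f"feature amount: {fv:.2f}"))

out.output(conv.to_feature(acq.acquire()))
```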
  • FIG. 2 is a diagram for explaining an example in which the work support apparatus according to the first embodiment of the present invention is applied to a machining condition determination work in an NC machine tool.
  • FIG. 3 is a perspective view showing a schematic configuration of the NC machine tool shown in FIG.
  • the NC machine tool 5 shown in FIG. 3 is an NC machine tool called a machining center, and uses a tool to remove unnecessary portions of the workpiece 6 and to machine the workpiece 6 into a target shape.
  • the NC machine tool 5 includes feed shaft mechanisms 20x, 20y, and 20z, a table 24, a column 25, a ram 26, and a main shaft 27.
  • the feed shaft mechanism 20x includes a rotary motor 21x, a feed screw 22x that is a feed shaft, and a rotation angle detector 23x.
  • the feed shaft mechanism 20x converts the rotational motion of the rotary motor 21x into linear motion by the feed screw 22x. Since the feed shaft mechanisms 20y and 20z are the same as the feed shaft mechanism 20x, description thereof is omitted.
  • The rotation angles of the rotary motors 21x, 21y, and 21z are detected by the rotation angle detectors 23x, 23y, and 23z, and arbitrary three-dimensional motion can be realized by numerically controlling each of these rotation angles.
  • A tool is attached to the tip of the main shaft 27; by rotating the main shaft 27 and controlling the relative position between the tool and the workpiece 6 placed on the table 24, unnecessary portions of the workpiece 6 are removed and the workpiece 6 is machined into the target shape.
  • the ram 26 is provided with an acceleration sensor 71 for measuring the acceleration of vibration of the ram 26.
  • A force sensor 72 that measures the machining force in the three orthogonal directions of the feed shafts (feed screws 22x, 22y, and 22z) is attached to the workpiece 6.
  • When the work support device 100a is applied to machining-condition determination work on the NC machine tool 5, the work environment 102 is the NC machine tool 5, the work object 103 is the workpiece 6, and the sensors constituting the sensor group 101a are the acceleration sensor 71 and the force sensor 72. The sensation generating devices constituting the sensation generating device group 104a are the vibration generating device 81 and the display device 82.
  • The machining-condition determination work on the NC machine tool 5 is the work of determining the machining conditions to be set on the NC machine tool 5, for example when machining a product workpiece 6 made of a new material, or when machining the workpiece 6 on a newly introduced NC machine tool 5.
  • The worker 1 determines the parameters of rotational speed, feed speed, cutting amount, and pick feed amount that realize the target machined-surface accuracy of the workpiece 6. These parameters correspond to the machining conditions.
  • The NC machine tool 5 generates a machining program, concerning the number of rotations of the main shaft 27 per unit time and the motion trajectory of the feed screws 22x, 22y, and 22z that are the feed axes, based on each parameter of the set machining conditions.
  • the NC machine tool 5 processes the workpiece 6 by controlling the movements of the main shaft 27 and the feed screws 22x, 22y, and 22z, which are feed axes, based on the machining program.
  • the acceleration sensor 71 measures the acceleration of vibration generated during machining.
  • the force sensor 72 measures a machining force generated during machining.
  • the sensor signal acquisition unit 2 of the work support apparatus 100a acquires a sensor signal SSa that is a measurement result of the acceleration sensor 71.
  • the sensor information conversion unit 3 of the work support device 100a performs a filtering process on the sensor signal SSa acquired by the sensor signal acquisition unit 2 to convert the sensor signal SSa into a feature value related to acceleration. In this embodiment, for example, a high-frequency vibration component is removed by a low-pass filter.
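A minimal sketch of the kind of low-pass filtering described here, using a first-order IIR (exponential smoothing) filter. The patent does not specify a cutoff or filter order, so the smoothing factor below is an assumption for illustration.

```python
def low_pass(signal, alpha=0.1):
    """First-order IIR low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1]).
    Smaller alpha attenuates high-frequency components more strongly."""
    out = []
    y = signal[0]
    for x in signal:
        y = y + alpha * (x - y)
        out.append(y)
    return out

# A slow drift with a high-frequency vibration riding on it,
# standing in for the acceleration sensor signal SSa:
raw = [i * 0.01 + ((-1) ** i) * 0.5 for i in range(200)]
feature = low_pass(raw, alpha=0.1)
# The alternating ±0.5 component is strongly attenuated in `feature`,
# leaving the slow trend as the feature amount related to acceleration.
```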
  • the sensory information output unit 4 of the work support device 100 a outputs sensory information SIa including a feature amount related to acceleration to the vibration generator 81.
  • the vibration generating device 81 transmits a feature amount related to acceleration to the sensory organ of the worker 1.
  • the vibration generator 81 is exemplified by a game controller or a vibrator used for a mobile phone.
  • the vibration generator 81 outputs a time waveform of a feature amount related to acceleration as vibration. As a result, vibration generated during processing can be transmitted to the tactile sense of the operator 1.
  • the sensor signal acquisition unit 2 acquires a sensor signal SSb that is a measurement result of the force sensor 72.
  • the sensor information conversion unit 3 converts the sensor signal SSb acquired by the sensor signal acquisition unit 2 into a feature amount of vector information indicating the direction of force and the magnitude of force.
  • the sensory information output unit 4 outputs sensory information SIb including the feature amount of the force vector information to the display device 82.
  • the display device 82 transmits the feature amount of the force vector information to the sensory organ of the worker 1.
  • the display device 82 is exemplified by a video monitor or wearable glasses.
  • the display device 82 outputs an image displaying the feature amount of the force vector information. Thereby, the processing force generated during the processing can be transmitted to the visual sense of the worker 1.
  • FIG. 4 is a diagram showing an example of an image displaying the feature amount of the vector information extracted from the sensor signal SSb of the force sensor 72 shown in FIG.
  • In FIG. 4, the direction and magnitude of the force measured every 10 ms by the force sensor 72 are displayed continuously as arrows, i.e., vectors: arrows for large forces are drawn with solid lines, arrows for intermediate forces with dashed lines, and arrows for small forces with dotted lines.
  • the operator 1 who has seen the image shown in FIG. 4 can recognize the possibility that a difference occurs in the machining state because the machining force is large at the portion indicated by the solid line arrow.
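The vector feature and the three-line-style display of FIG. 4 can be sketched as follows. The numeric thresholds for "large", "intermediate", and "small" forces are illustrative assumptions; the patent gives no values.

```python
import math

def vector_feature(fx, fy, fz):
    """Feature amount of vector information: magnitude and direction of force."""
    mag = math.sqrt(fx * fx + fy * fy + fz * fz)
    direction = (fx / mag, fy / mag, fz / mag) if mag > 0 else (0.0, 0.0, 0.0)
    return mag, direction

def line_style(mag, small=10.0, large=50.0):
    """Map force magnitude [N] to the arrow style used in FIG. 4
    (thresholds are assumed for illustration)."""
    if mag >= large:
        return "solid"
    if mag >= small:
        return "dashed"
    return "dotted"

# One sample measured every 10 ms by the force sensor 72:
mag, direction = vector_feature(30.0, 40.0, 0.0)
style = line_style(mag)  # a large force is drawn as a solid arrow
```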
  • A skilled worker who determines the machining conditions to be set on the NC machine tool 5 senses differences in the machining state of the workpiece 6 with the body's sensory organs, as differences in sound or in the appearance of chips, and can determine machining conditions that realize the required machining accuracy.
  • The worker 1, whose bodily sensory organs are not sufficiently trained, cannot sense slight differences in the machining state of the workpiece 6 with those organs.
  • By transmitting the feature amount related to acceleration and the feature amount of vector information to the sensory organs of the worker 1, the worker 1 can be made to recognize the vibration and machining force generated during machining. Thereby, even the worker 1 can perform work equivalent to that performed by a skilled worker.
  • The worker 1 can obtain, as vibration and visual information, the machining information that a skilled worker would judge from the sound generated during machining, the way chips fly, and so on; the worker's cognitive ability through the sensory organs is thus augmented, making it possible to judge the machining state.
  • The worker 1 can identify a location where an abnormality or machining error has occurred by comparing the machining result with the machining information. As a result, the worker 1 can determine machining conditions under which no abnormality or machining error occurs, i.e., appropriate machining conditions.
  • In the above description, the NC machine tool 5 generates a machining program based on the parameters of the set machining conditions; alternatively, a machining program created by CAM (Computer Aided Manufacturing) software may be transferred to the NC machine tool 5.
  • In the above description, the work support device 100a is applied to the NC machine tool 5, which is a milling-type machining center. However, the work support device 100a may also be applied to an NC machine tool with a machine configuration or machining principle different from a milling-type machining center, such as a lathe-type NC machine tool, a 5-axis machine tool, a multi-tasking machine, a laser processing machine, or an electric discharge machine.
  • In the above description, the filtering process is performed by a low-pass filter; however, it may be performed by any of a low-pass filter, a high-pass filter, a band-pass filter, or a band-elimination filter, or by a combination of two or more of these.
  • The feature amount may be extracted by integrating a plurality of sensor signals into a vector, or the signal of a three-axis sensor may be extracted as a one-dimensional scalar amount.
  • the acceleration sensor 71 and the force sensor 72 are used as sensors constituting the sensor group 101a, but a temperature sensor, a laser displacement meter, a Doppler vibrometer, or the like may be used.
  • a plurality of types of sensors may be provided on the same object, or a plurality of sensors of the same type may be provided on the same object.
  • the vibration generating device 81 and the display device 82 are used as the sensation generating devices constituting the sensation generating device group 104a, but a sensation generating device such as a sound generating device may be used.
  • the sensory information output unit 4 may output sensory information SIa and SIb with a delay from the acquisition timing of the sensor signals SSa and SSb.
  • The sensory information output unit 4 may output the sensory information SIa and SIb repeatedly.
  • the sensory information output unit 4 may output sensory information SIa and SIb in a time shorter or longer than a time required for measuring the sensor signals SSa and SSb.
  • FIG. 5 is a diagram for explaining the configuration of the work support apparatus according to the second embodiment of the present invention.
  • FIG. 6 is a diagram for explaining the configuration of the sensor information conversion unit shown in FIG.
  • The work support apparatus 100b according to the second embodiment of the present invention mainly differs from the first embodiment described above in that it includes an output method setting unit 7 and in the configuration of the sensor information conversion unit 3. A description of the configuration and operation that are the same as in the first embodiment is omitted, and the differing configuration and operation are described below.
  • the worker 1 sets the output method of the feature amount in the output method setting unit 7.
  • the output method setting unit 7 outputs a feature amount output method setting command SC to the sensor information conversion unit 3.
  • the output method setting unit 7 corresponds to the first setting unit.
  • The sensor information conversion unit 3 shown in FIG. 6 includes a sensory information setting unit 31, an algorithm storage unit 32, and a conversion calculation execution unit 33.
  • the setting command SC output from the output method setting unit 7 is input to the sensory information setting unit 31.
  • the sensory information setting unit 31 outputs to the algorithm storage unit 32 a selection command CC for an arithmetic expression used in the calculation of the feature amount based on the input setting command SC.
  • Based on the selection command CC, the algorithm storage unit 32 selects, from the arithmetic expressions it stores, the expression to be used for calculating the feature amount, and outputs the information AEI of the selected expression to the conversion calculation execution unit 33.
  • the conversion calculation execution unit 33 calculates the feature value FV from the sensor signal SS and the calculation formula information AEI, and outputs the feature value FV to the sensory information output unit 4.
  • the sensory information output unit 4 selects a sensory generation device corresponding to the setting command SC and the feature amount FV, and outputs sensory information SI to the selected sensory generation device.
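The selection flow described above, in which a setting command SC chooses both an arithmetic expression from the algorithm storage unit and a matching sensation generating device, might be sketched as a pair of lookup tables. All keys, expressions, and device stand-ins here are illustrative assumptions.

```python
# Algorithm storage unit 32: arithmetic expressions keyed by feature type.
ALGORITHMS = {
    "rms":  lambda s: (sum(x * x for x in s) / len(s)) ** 0.5,
    "peak": lambda s: max(abs(x) for x in s),
    "mean": lambda s: sum(s) / len(s),
}

# Sensation generating devices keyed by the output method in the setting command.
DEVICES = {
    "sound":     lambda fv: f"sound at level {fv:.2f}",
    "vibration": lambda fv: f"vibration at level {fv:.2f}",
}

def convert_and_output(sensor_signal, setting_command):
    """Setting command SC -> algorithm selection -> conversion -> device output."""
    expr = ALGORITHMS[setting_command["algorithm"]]   # selection command CC
    fv = expr(sensor_signal)                          # conversion calculation (FV)
    device = DEVICES[setting_command["output"]]       # device chosen from SC
    return device(fv)

sc = {"algorithm": "peak", "output": "sound"}
result = convert_and_output([0.1, -0.8, 0.3], sc)  # peak |x| = 0.8, sent as sound
```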
  • Thus, the worker 1 can set the output method of the feature amount, and the feature amount can be transmitted to the sensory organ best suited to the worker 1.
  • FIG. 7 is a diagram for explaining an example in which the work support apparatus according to the second embodiment of the present invention is applied to a parameter adjustment work in a robot pick-and-place.
  • FIG. 8 is a perspective view showing a schematic configuration of the robot shown in FIG.
  • FIG. 9 is a diagram for explaining an example of the arrangement of the sensation generating device group and workers shown in FIG.
  • the robot 8 shown in FIG. 8 grips the target workpiece 9 and moves the target workpiece 9.
  • the robot 8 includes a robot hand 51, a robot arm 52, and a motor 53 with a speed reducer.
  • the operations of the robot hand 51 and the motor 53 with a speed reducer are numerically controlled by the robot controller 73.
  • A force sensor 74 that measures forces in three orthogonal directions on the robot hand 51 and the torque of each axis of the robot hand 51 is attached to the robot hand 51.
  • When the work support apparatus 100b is applied to parameter adjustment work in pick-and-place by the robot 8, the work environment 102 is the robot 8, the work object 103 is the target workpiece 9, and the sensors constituting the sensor group 101b are the robot controller 73 of the robot 8 and the force sensor 74 provided on the robot hand 51. The sensation generating devices constituting the sensation generating device group 104b are the vibration generating device 81, the display device 82, the sound generating device 83, and the force sense generating device 84.
  • In the example of FIG. 9, the worker 1 wears a vibration generating device 81a, a display device 82a, and a sound generating device 83a.
  • a sound generator 83b and a force sense generator 84a are attached to the robot controller 73.
  • the worker 1 sets the feature value output method in the output method setting unit 7 of the work support device 100b using the input device.
  • the input device is exemplified by a keyboard, a mouse, a button, or a switch.
  • In this example, the worker 1 sets the output method so that the feature amount of the haptic information is transmitted as sound, that is, output by the sound generating device 83, and also sets the frequency band to be extracted. The conversion calculation execution unit 33 then applies, to the sensor signal SSd of the force sensor 74, a filter process that separates the signal in the band set by the worker 1, and calculates the feature amount FV.
  • The worker 1 determines the pick-and-place parameters of the robot 8.
  • the robot controller 73 acquires a current feedback value signal and a motor angle signal of each motor 53 with a speed reducer in the pick-and-place of the robot 8.
  • the force sensor 74 measures three orthogonal forces generated in the robot hand 51 and the torque of each axis of the robot hand 51 in the pick-and-place of the robot 8.
  • the sensor signal acquisition unit 2 of the work support apparatus 100b acquires a sensor signal SSd that is a measurement result of the force sensor 74.
  • The sensor information conversion unit 3 of the work support device 100b applies a filtering process to the sensor signal SSd acquired by the sensor signal acquisition unit 2 to separate the signal in the band set by the worker 1, and converts it into a feature amount of sound information.
  • the sensory information output unit 4 of the work support device 100a outputs sensory information SIc including the feature amount of the sound information to the sound generator 83.
  • the sound generator 83 transmits the feature amount of the sound information to the sensory organ of the worker 1.
  • the sound generator 83 is exemplified by an earphone, and outputs the feature amount of the sound information with the strength of the sound.
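The band-separation-to-loudness conversion described above might be sketched as follows. The "band" here is realized as the difference of two smoothing filters, and all cutoff constants are illustrative assumptions; the patent does not prescribe a specific filter design.

```python
def ema(signal, alpha):
    """First-order smoothing filter (exponential moving average)."""
    out, y = [], signal[0]
    for x in signal:
        y = y + alpha * (x - y)
        out.append(y)
    return out

def band_separate(signal, alpha_fast=0.5, alpha_slow=0.05):
    """Crude band separation: the difference of a fast and a slow smoother
    keeps the components between the two (worker-set) cutoffs."""
    fast = ema(signal, alpha_fast)
    slow = ema(signal, alpha_slow)
    return [f - s for f, s in zip(fast, slow)]

def sound_level(band_signal):
    """Feature amount of sound information: RMS of the band, output as loudness."""
    return (sum(x * x for x in band_signal) / len(band_signal)) ** 0.5

# A gripping event in the force sensor signal SSd (step between samples 40-59):
force = [0.5 if 40 <= i < 60 else 0.0 for i in range(100)]
level = sound_level(band_separate(force))  # nonzero loudness during the event
```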
  • the worker 1 can set the output method of the feature amount by using the input device. Thereby, even if the worker 1 has different physical abilities, preferences, senses, and the like, it is possible to transmit the feature quantity to the most suitable sensory organ, and it is possible to engage various human resources in the work.
  • The worker 1 can also have the feature amount extracted from the sensor signal SSd of the force sensor 74 output to another sensation generating device of the sensation generating device group 104b; for example, the vibration generating device 81 can transmit it as vibration.
  • In the above description, sensation generating devices using hearing, sight, and touch are used. However, an olfactory generating device may also be used: for example, one that prepares a plurality of fragrances and transmits the gripping force by differences in odor, or one that controls the strength of an exhaust fan of a casing in which a fragrance is sealed and transmits the gripping force by the intensity of the odor.
  • FIG. 10 is a diagram for explaining the configuration of the work support apparatus according to the third embodiment of the present invention.
  • the work support apparatus 100c according to the third embodiment of the present invention is mainly different from the above-described first embodiment in that the information storage unit 10 is provided.
  • the description of the same configuration and operation as those in the first embodiment will be omitted, and a description of the different configuration and operation will be given below.
  • the information storage unit 10 stores the sensor signal CSS acquired by the sensor signal acquisition unit 2 in the current operation and the sensor signal PSS acquired by the sensor signal acquisition unit 2 in the past operation.
  • the information storage unit 10 corresponds to the first storage unit.
  • the sensor signal CSS corresponds to the first sensor signal.
  • the sensor signal PSS corresponds to the second sensor signal.
  • the sensor information conversion unit 3 converts the sensor signal CSS acquired by the sensor signal acquisition unit 2 into a feature amount FVa that can be sensed by a human sensory organ by calculation or the like.
  • the information storage unit 10 may store the feature amount FVa.
  • the sensor information conversion unit 3 converts the sensor signal PSS stored in the information storage unit 10 into a feature amount FVb that can be sensed by a human sensory organ by calculation or the like.
  • the information storage unit 10 may store the feature amount FVb.
  • the sensory information output unit 4 outputs sensory information SIa including the feature amount FVa and sensory information SIb including the feature amount FVb converted by the sensor information conversion unit 3 to the sensory generation device group 104, respectively.
  • the sensation generating device group 104 transmits the feature value FVa and the feature value FVb to the sensory organ of the worker 1.
  • The physical quantities PQ1 and PQ2 of the work environment 102 and the work object 103 in the current work can be transmitted to the sensory organs of the worker 1 as the feature amount FVa, and those in the past work as the feature amount FVb.
  • The sensation generating devices constituting the sensation generating device group 104 may transmit the past feature amount FVb and the current feature amount FVa to the worker 1 simultaneously or sequentially.
  • the sensation generating devices constituting the sensation generating device group 104 may transmit the difference between the past feature value FVb and the current feature value FVa to the worker 1.
  • When the display device 82 is used as the sensation generating device, the past feature amount FVb and the current feature amount FVa may be displayed overlaid. In this case, for example, the worker can easily grasp the difference in effect between the machining conditions currently set by the worker 1 and those set in the past.
  • When the sound generating device 83 is used as the sensation generating device, the difference between the past feature amount FVb and the current feature amount FVa may be transmitted to the worker 1 by the sound volume.
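The difference-to-volume mapping described above can be sketched in a few lines. The full-scale normalization and the 0-100 volume range are illustrative assumptions.

```python
def difference_volume(fv_past, fv_current, full_scale=1.0):
    """Map |FVa - FVb| to a 0-100 volume level, clipped at full scale."""
    diff = abs(fv_current - fv_past)
    return min(100.0, 100.0 * diff / full_scale)

# Current machining (FVa) vibrates more than the stored past run (FVb):
past_fv, current_fv = 0.30, 0.55
volume = difference_volume(past_fv, current_fv)  # about 25 out of 100
```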
  • FIG. 11 is a diagram for explaining the configuration of the work support apparatus according to the fourth embodiment of the present invention.
  • the work support apparatus 100d according to the fourth embodiment of the present invention is mainly different from the above-described third embodiment in that it includes a training evaluation input unit 11.
  • the description of the same configuration and operation as those of the third embodiment will be omitted, and a description of the different configuration and operation will be given below.
  • the training instructor 12 inputs a condition and training evaluation for the current work of the worker 1 to the training evaluation input unit 11 using the input device.
  • the training evaluation input unit 11 corresponds to the first input unit.
  • the input device is exemplified by a keyboard, a mouse, a button, or a switch.
  • FIG. 12 is a diagram illustrating an example of an input screen for conditions input to the training evaluation input unit 11 illustrated in FIG. 11.
  • FIG. 12 shows an input screen for machining conditions in the NC machine tool 5 described above.
  • The training evaluation that the training instructor 12 inputs to the training evaluation input unit 11 indicates whether the work performed by the worker 1 is good or bad. The evaluation may be a binary input such as pass/fail, a five-level numerical rating, or a score on a 100-point scale.
  • the condition and the training evaluation input to the training evaluation input unit 11 are stored in the information storage unit 10 in association with the current sensor signal CSS.
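The association of conditions, training evaluation, and the current sensor signal CSS in the information storage unit might be sketched as a simple record store. The field names and condition keys are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TrainingRecord:
    """One entry of the information storage unit 10 (illustrative fields)."""
    conditions: Dict[str, float]   # e.g. machining conditions from FIG. 12
    evaluation: str                # e.g. "pass" / "fail", or a numeric score
    sensor_signal: List[float]     # the current sensor signal CSS

class InformationStorageUnit:
    def __init__(self):
        self.records: List[TrainingRecord] = []

    def store(self, record: TrainingRecord) -> None:
        # Conditions and evaluation are kept in association with CSS
        self.records.append(record)

store = InformationStorageUnit()
store.store(TrainingRecord(
    conditions={"spindle_rpm": 8000, "feed_mm_min": 1200, "cutting_mm": 0.5},
    evaluation="pass",
    sensor_signal=[0.01, 0.03, 0.02],
))
```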
  • FIG. 13 is a diagram illustrating an example of a screen in which the conditions and training evaluation stored in the information storage unit 10 illustrated in FIG. 11 are displayed on the display device 82 for each worker.
  • In FIG. 13, the conditions and training evaluations that the training instructor 12 inputs to the training evaluation input unit 11 are displayed on a three-dimensional map. A case where the training evaluation is good is indicated by a circle, and a case where it is poor by a cross.
  • By recognizing the conditions and the training evaluations, the worker 1 can learn the points to improve and the weaknesses of his or her own work without receiving direct guidance from the training instructor 12.
  • Even when the worker 1 is a foreign worker who speaks only a language different from that of the training instructor 12, and the two cannot communicate smoothly in a common language, the worker 1 can still be made to recognize the training evaluation.
  • FIG. 14 is a diagram for explaining the configuration of the work support apparatus according to the fifth embodiment of the present invention.
  • The work support apparatus 100e according to the fifth embodiment of the present invention differs from the fourth embodiment described above mainly in that it includes an output method setting unit 7, a know-how input unit 13, and a know-how storage unit 14.
  • The description of configurations and operations identical to those of the second and fourth embodiments is omitted, and only the differing configurations and operations are described below.
  • The work support apparatus 100e shown in FIG. 14 is used to hand down to the worker 1 skills of the training instructor 12 that are difficult to document.
  • The training instructor 12 corresponds to the first worker.
  • The worker 1 corresponds to the second worker.
  • The work support apparatus 100e includes the output method setting unit 7, the know-how input unit 13, and the know-how storage unit 14.
  • The training instructor 12 inputs know-how information to the know-how input unit 13.
  • Examples of the know-how information include the type of sensor to refer to, the algorithm to use, the type of sensation generating device that is easiest to perceive, and the tendencies of feature amounts that lead to good results.
  • The know-how input unit 13 corresponds to the second input unit.
  • The know-how storage unit 14 stores the know-how information input to the know-how input unit 13.
  • The know-how storage unit 14 corresponds to the second storage unit.
  • FIG. 15 is a flowchart showing an example of a work training procedure using the work support apparatus 100e shown in FIG. 14.
  • First, the training instructor 12 gives a work demonstration using the work support apparatus 100e (step S1).
  • Next, the training instructor 12 inputs his or her know-how information into the know-how input unit 13 (step S2).
  • The know-how information input to the know-how input unit 13 is stored in the know-how storage unit 14, and the sensor signal SS recorded while the training instructor 12 works is stored in the information storage unit 10 as sensor information acquired in past work.
  • Next, the worker 1 uses the work support apparatus 100e to perform the same work as that performed by the training instructor 12 (step S3).
  • The sensor signal recorded during the work of the worker 1 is stored in the information storage unit 10 as sensor information acquired in the current work.
  • Both the feature amount extracted from the sensor signal SS recorded while the training instructor 12 performed the work and the feature amount extracted from the sensor signal SS recorded while the worker 1 performs his or her own work are transmitted to the worker 1.
  • The worker 1 refers to the know-how information stored in the know-how storage unit 14 and changes the output method (step S4).
  • If the target skill has not been mastered (No in step S5), steps S3 and S4 are repeated.
  • If the target skill has been mastered (Yes in step S5), the work training is terminated.
  • Because the feature amount extracted from the sensor signal SS of the training instructor 12's work can be compared with the feature amount extracted from the sensor signal SS of the worker 1's own work, the skill can be handed down to the worker 1 without the worker 1 receiving direct instruction from the training instructor 12.
  • Furthermore, a skilled worker who has no successor can save the sensor information and know-how information of his or her skill, so that the skill can still be handed down when a successor appears, even after the skilled worker's abilities have declined with age or the skilled worker has passed away.
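The S1-S5 procedure of FIG. 15 can be sketched as a loop. The mastery criterion below (the trainee's feature amount falling within a tolerance of the instructor's) and the mean-absolute-amplitude feature are assumptions introduced for illustration; the patent does not prescribe them.

```python
def extract_feature(sensor_signal):
    """Illustrative feature amount: mean absolute amplitude of the signal."""
    return sum(abs(s) for s in sensor_signal) / len(sensor_signal)

def train(instructor_signal, trainee_attempts, tolerance=0.05):
    reference = extract_feature(instructor_signal)              # S1: demonstration
    know_how = {"feature": "mean_abs", "tolerance": tolerance}  # S2: know-how input
    for attempt, signal in enumerate(trainee_attempts, start=1):
        fv = extract_feature(signal)                            # S3: trainee works
        if abs(fv - reference) <= know_how["tolerance"]:        # S5: skill mastered?
            return attempt
        # S4: consult the stored know-how, adjust the output method, repeat S3
    return None  # skill not yet mastered after all attempts

attempts = [[0.9, 0.9, 0.8], [0.6, 0.7, 0.6], [0.5, 0.5, 0.52]]
mastered_on = train([0.5, 0.5, 0.5], attempts)
```

Here the trainee's signal converges toward the instructor's reference over successive attempts, and the loop reports the attempt on which the tolerance was first met.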
  • FIG. 16 is a diagram for explaining the configuration of the work support apparatus according to the sixth embodiment of the present invention.
  • The work support apparatus 100f according to the sixth embodiment of the present invention differs from the fifth embodiment described above mainly in that it includes an AI (Artificial Intelligence) analysis unit 15.
  • The work support apparatus 100f shown in FIG. 16 includes the AI analysis unit 15.
  • The AI analysis unit 15 extracts know-how information from the sensor signal PSS stored in the information storage unit 10 together with the associated conditions and evaluation results, and stores the extracted know-how information in the know-how storage unit 14.
  • The algorithm used for the analysis in the AI analysis unit 15 is supervised learning: a training set is created from the sensor signal PSS, the conditions, and the evaluation results, and learning is performed on it.
  • The AI analysis unit 15 may use, for example, a neural network or a Q-learning algorithm.
  • The AI analysis unit 15 can thus extract tacit knowledge of which neither the training instructor 12 nor the worker 1 is aware and transmit it to the worker 1.
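The patent names a neural network or Q-learning only as examples of the supervised learning used by the AI analysis unit 15. The toy sketch below substitutes a nearest-neighbour rule, purely as an assumed stand-in, to show how a training set of feature amounts paired with evaluation results could be built and queried; all signals and labels are fabricated for illustration.

```python
def extract_feature(signal):
    """Illustrative feature amount: mean absolute amplitude."""
    return sum(abs(s) for s in signal) / len(signal)

# Training set built from past sensor signals (PSS) and their evaluations.
training_set = [
    (extract_feature([0.50, 0.50, 0.50]), "good"),
    (extract_feature([0.52, 0.48, 0.50]), "good"),
    (extract_feature([0.90, 1.00, 0.95]), "poor"),
    (extract_feature([0.85, 0.90, 0.88]), "poor"),
]

def predict(signal):
    """Label a new signal by its nearest neighbour in feature space."""
    fv = extract_feature(signal)
    return min(training_set, key=lambda pair: abs(pair[0] - fv))[1]

label = predict([0.49, 0.51, 0.50])
```

A learned mapping of this kind, from sensed behaviour to evaluation, is one way tacit knowledge that no one articulated could be surfaced and fed back to the worker.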
  • FIG. 17 is a diagram for explaining the configuration of the work support apparatus according to the seventh embodiment of the present invention.
  • The work support apparatus 100g according to the seventh embodiment of the present invention differs from the first embodiment described above mainly in that it includes a theoretical value setting unit 16 and a theoretical value calculation unit 17.
  • The description of configurations and operations identical to those of the first embodiment is omitted, and only the differing configurations and operations are described below.
  • The work support apparatus 100g shown in FIG. 17 includes the theoretical value setting unit 16 and the theoretical value calculation unit 17.
  • The worker 1 sets, in the theoretical value setting unit 16, the theoretical value of the physical quantity PQ1 of the work environment 102 and the theoretical value of the physical quantity PQ2 of the work object 103.
  • The theoretical value setting unit 16 corresponds to the second setting unit.
  • The theoretical value calculation unit 17 generates a sensor signal TSS calculated theoretically from the theoretical values set in the theoretical value setting unit 16, and outputs the sensor signal TSS to the sensor information conversion unit 3.
  • The theoretical value calculation unit 17 corresponds to the calculation unit.
  • The sensor information conversion unit 3 converts the sensor signal SS acquired by the sensor signal acquisition unit 2, by calculation or the like, into a feature amount FVa that a human sensory organ can perceive.
  • The sensor information conversion unit 3 likewise converts the sensor signal TSS input from the theoretical value calculation unit 17 into a feature amount FVb that a human sensory organ can perceive.
  • The sensory information output unit 4 outputs sensory information SIa including the feature amount FVa and sensory information SIb including the feature amount FVb, both converted by the sensor information conversion unit 3, to the sensation generating device group 104.
  • The sensation generating device group 104 transmits the feature amounts FVa and FVb to the sensory organs of the worker 1.
  • The worker 1 can compare the feature amount FVa extracted from the actual sensor signal SS with the feature amount FVb extracted from the sensor signal TSS calculated theoretically from the theoretical values, which enables the worker 1 to acquire tacit knowledge that cannot be derived theoretically from the theoretical values.
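A sketch of this comparison under assumed numbers: a theoretical sensor signal TSS is generated from a set theoretical value, both signals are reduced to RMS feature amounts (the choice of RMS, and all the values, are assumptions for illustration), and the deviation is what the worker would perceive as the gap between actual and theoretical behaviour.

```python
def theoretical_signal(force_amplitude, n_samples):
    """Theoretical value calculation: an ideal, constant cutting force."""
    return [force_amplitude] * n_samples

def to_feature(signal):
    """Convert a sensor signal into a perceivable feature amount (RMS)."""
    return (sum(s * s for s in signal) / len(signal)) ** 0.5

measured_ss = [10.2, 9.8, 11.5, 10.0, 12.1]       # actual sensor signal SS
tss = theoretical_signal(10.0, len(measured_ss))  # theoretical signal TSS
fva = to_feature(measured_ss)   # feature amount from the real signal
fvb = to_feature(tss)           # feature amount from the theoretical signal
deviation = fva - fvb           # the perceivable gap between practice and theory
```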
  • FIG. 18 is a diagram illustrating an example of the hardware configuration of the work support apparatuses 100a, 100b, 100c, 100d, 100e, 100f, and 100g (hereinafter collectively referred to as "work support apparatus 100") in the embodiments described above.
  • The work support apparatus 100 includes an input/output interface circuit 201, which comprises an input circuit for inputting information from outside the work support apparatus 100 and an output circuit for outputting information to the outside of the work support apparatus 100, a processor 202, and a memory 203.
  • The input/output interface circuit 201 sends information received from the outside to the memory 203.
  • The memory 203 stores the information received from the input/output interface circuit 201.
  • The memory 203 also stores a computer program.
  • The processor 202 reads the computer program stored in the memory 203 and performs arithmetic processing based on the information stored in the memory 203; calculation result information indicating the results of the processing is sent to the memory 203.
  • The input/output interface circuit 201 sends information stored in the memory 203 to the outside.
  • The configurations described in the above embodiments illustrate examples of the subject matter of the present invention; they can be combined with other known techniques, and parts of them can be omitted or changed without departing from the gist of the present invention.


Abstract

An operation assistance device (100a) is provided with: a sensor signal acquisition unit (2) for acquiring a sensor signal (SS) of a sensor group (101) which measures a physical amount of an operation environment (102) or an object to be operated (103); a sensor information conversion unit (3) for converting the sensor signal (SS) into a feature amount (FV) which can be sensed by a sensory organ of an operator (1); and a sensory information output unit (4) for outputting the feature amount (FV) so as to be capable of being sensed by the sensory organ of the operator (1).

Description

Work support device
The present invention relates to a work support apparatus that supports unskilled workers in becoming proficient at their work.
At factory work sites, training for work proficiency is carried out when workers are newly hired or when new work is introduced. Such training takes various forms depending on the target work and the site where it is performed, such as memorizing procedure manuals or instruction sheets, repeating the same procedure on practice products, or receiving on-site guidance from a skilled worker or supervisor.
Techniques for supporting the work proficiency of unskilled workers have been proposed. For example, Patent Literature 1 proposes an operation training apparatus that can evaluate improvement in the proficiency of operation training.
JP 2002-287613 A
For example, a skilled worker who determines the machining conditions to be set on an NC (Numerical Control) machine tool can sense differences in the machining state of the machining target with his or her sensory organs, as differences in sound or in the appearance of chips, and can determine machining conditions that achieve the required machining accuracy. An unskilled worker, however, whose sensory organs have not been sufficiently trained, cannot perceive slight differences in the machining state of the machining target. There has therefore been a problem that it is difficult for an unskilled worker to perform work equivalent to work performed by a skilled worker.
The present invention has been made in view of the above, and an object thereof is to obtain a work support apparatus that enables even an unskilled worker to perform work equivalent to work performed by a skilled worker.
To solve the above problem and achieve the object, a work support apparatus according to the present invention includes: an acquisition unit that acquires a sensor signal of a sensor that measures a state quantity of a work environment or a work object; a conversion unit that converts the sensor signal into a feature amount that a worker's sensory organ can perceive; and an output unit that outputs the feature amount so that the worker's sensory organ can perceive it.
The work support apparatus according to the present invention has the effect of enabling even an unskilled worker to perform work equivalent to work performed by a skilled worker.
FIG. 1 is a diagram for explaining the configuration of the work support apparatus according to the first embodiment of the present invention. FIG. 2 is a diagram for explaining an example in which the work support apparatus according to the first embodiment of the present invention is applied to machining condition determination work on an NC machine tool. FIG. 3 is a perspective view showing the schematic configuration of the NC machine tool shown in FIG. 2. FIG. 4 is a diagram showing an example of an image displaying the feature amounts of the vector information extracted from the sensor signal of the force sensor shown in FIG. 2. FIG. 5 is a diagram for explaining the configuration of the work support apparatus according to the second embodiment of the present invention. FIG. 6 is a diagram for explaining the configuration of the sensor information conversion unit shown in FIG. 5. FIG. 7 is a diagram for explaining an example in which the work support apparatus according to the second embodiment of the present invention is applied to parameter adjustment work in pick-and-place by a robot. FIG. 8 is a perspective view showing the schematic configuration of the robot shown in FIG. 7. FIG. 9 is a diagram for explaining an example of the arrangement of the sensation generating device group and the worker shown in FIG. 7. FIG. 10 is a diagram for explaining the configuration of the work support apparatus according to the third embodiment of the present invention. FIG. 11 is a diagram for explaining the configuration of the work support apparatus according to the fourth embodiment of the present invention.
FIG. 12 is a diagram showing an example of an input screen for the conditions input to the training evaluation input unit shown in FIG. 11. FIG. 13 is a diagram showing an example of a screen on which the conditions and training evaluations stored in the information storage unit shown in FIG. 11 are displayed on the display device for each worker. FIG. 14 is a diagram for explaining the configuration of the work support apparatus according to the fifth embodiment of the present invention. FIG. 15 is a flowchart showing an example of a work training procedure using the work support apparatus shown in FIG. 14. FIG. 16 is a diagram for explaining the configuration of the work support apparatus according to the sixth embodiment of the present invention. FIG. 17 is a diagram for explaining the configuration of the work support apparatus according to the seventh embodiment of the present invention. FIG. 18 is a diagram showing an example of the hardware configuration of the work support apparatus in each embodiment.
Hereinafter, work support apparatuses according to embodiments of the present invention will be described in detail with reference to the drawings. The present invention is not limited to these embodiments.
Embodiment 1.
First, the work support apparatus according to the first embodiment of the present invention will be described. FIG. 1 is a diagram for explaining the configuration of the work support apparatus according to the first embodiment of the present invention.
The work support apparatus 100a shown in FIG. 1 supports the work proficiency of an unskilled worker 1 (hereinafter simply "worker 1"). The work support apparatus 100a includes a sensor signal acquisition unit 2, a sensor information conversion unit 3, and a sensory information output unit 4. The sensor signal acquisition unit 2 corresponds to the acquisition unit, the sensor information conversion unit 3 corresponds to the conversion unit, and the sensory information output unit 4 corresponds to the output unit.
The sensor signal acquisition unit 2 acquires a sensor signal SS, the measurement result of a sensor group 101. The sensor group 101 consists of at least one sensor. Some of the sensors constituting the sensor group 101 are provided in a work environment 102 and measure a physical quantity PQ1 of the work environment 102; the physical quantity PQ1 corresponds to a state quantity. Other sensors constituting the sensor group 101 are provided on a work object 103 and measure a physical quantity PQ2 of the work object 103; the physical quantity PQ2 corresponds to a state quantity. The sensors constituting the sensor group 101 may be provided on only one of the work environment 102 and the work object 103.
The sensor information conversion unit 3 converts the sensor signal SS acquired by the sensor signal acquisition unit 2, by calculation or the like, into a feature amount FV that a human sensory organ can perceive. The sensory information output unit 4 outputs sensory information SI including the feature amount FV converted by the sensor information conversion unit 3 to a sensation generating device group 104. The sensation generating device group 104 consists of at least one sensation generating device and transmits the feature amount FV to the sensory organs of the worker 1.
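The acquisition-conversion-output flow described above can be sketched as follows. The function names and the peak-amplitude feature are illustrative assumptions, not terms from the patent; the sensation generating device is modelled as a plain callable.

```python
def acquire_sensor_signal(samples):
    """Sensor signal acquisition unit 2: collect the raw sensor signal SS."""
    return list(samples)

def convert_to_feature(sensor_signal):
    """Sensor information conversion unit 3: reduce SS to a quantity a person
    can perceive, here the peak absolute amplitude of the signal."""
    return max(abs(s) for s in sensor_signal)

def output_sensory_information(feature_value, device):
    """Sensory information output unit 4: hand the feature amount FV to a
    sensation generating device (modelled as a callable)."""
    return device(feature_value)

vibration_device = lambda fv: "vibrate at intensity %.1f" % fv
ss = acquire_sensor_signal([0.1, -0.4, 0.25, -0.9, 0.3])
fv = convert_to_feature(ss)
message = output_sensory_information(fv, vibration_device)
```

The same three-stage shape holds for every sensor/device pairing in the later embodiments; only the conversion and the device change.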
It is difficult for the worker 1 to perform work equivalent to work performed by a skilled worker. According to the present embodiment, during the work of the worker 1, the physical quantities PQ1 and PQ2 of the work environment 102 and the work object 103 are transmitted, as the feature amount FV, to sensory organs through which the worker 1 can understand them intuitively. This enables even the worker 1 to perform work equivalent to work performed by a skilled worker.
Example of the first embodiment.
Next, an example of the work support apparatus according to the first embodiment of the present invention will be described. FIG. 2 is a diagram for explaining an example in which the work support apparatus according to the first embodiment of the present invention is applied to machining condition determination work on an NC machine tool. FIG. 3 is a perspective view showing the schematic configuration of the NC machine tool shown in FIG. 2.
The NC machine tool 5 shown in FIG. 3 is an NC machine tool called a machining center; using a tool, it removes unnecessary portions of a workpiece 6 to machine the workpiece 6 into a target shape. The NC machine tool 5 includes feed shaft mechanisms 20x, 20y, and 20z, a table 24, a column 25, a ram 26, and a main spindle 27. The feed shaft mechanism 20x includes a rotary motor 21x, a feed screw 22x serving as a feed shaft, and a rotation angle detector 23x, and converts the rotational motion of the rotary motor 21x into linear motion via the feed screw 22x. The feed shaft mechanisms 20y and 20z are identical to the feed shaft mechanism 20x, so their description is omitted. In the NC machine tool 5, the rotation angles of the rotary motors 21x, 21y, and 21z are detected by the rotation angle detectors 23x, 23y, and 23z, respectively, and arbitrary three-dimensional motion can be realized by numerically controlling each motor rotation angle. A tool can be attached to the tip of the main spindle 27; by rotating the main spindle 27 and controlling the relative position between the tool and the workpiece 6 placed on the table 24, unnecessary portions of the workpiece 6 are removed and the workpiece 6 is machined into the target shape.
In this example, an acceleration sensor 71 that measures the acceleration of vibration of the ram 26 is attached to the ram 26, and a force sensor 72 that measures machining forces in three directions, each orthogonal to the feed screws 22x, 22y, and 22z serving as feed shafts, is attached to the workpiece 6.
As shown in FIG. 2, when the work support apparatus 100a is applied to machining condition determination work on the NC machine tool 5, the work environment 102 is the NC machine tool 5, the work object 103 is the workpiece 6, the sensors constituting a sensor group 101a are the acceleration sensor 71 provided on the NC machine tool 5 and the force sensor 72 provided on the workpiece 6, and the sensation generating devices constituting a sensation generating device group 104a are a vibration generating device 81 and a display device 82.
Next, the flow of supporting the work proficiency of the worker 1 using the work support apparatus 100a in machining condition determination work on the NC machine tool 5 will be described. Machining condition determination work is the work of determining the machining conditions to be set in the NC machine tool 5 for machining the workpiece 6, carried out, for example, when machining a product from a workpiece 6 made of a new material or when a new NC machine tool 5 is introduced. In this work, the worker 1 determines the rotation speed, feed rate, depth of cut, and pick feed amount parameters for achieving the target machined surface accuracy of the workpiece 6. These parameters correspond to the machining conditions.
First, the worker 1 sets the machining conditions recommended by the tool manufacturer in the NC machine tool 5. Based on the set machining condition parameters, the NC machine tool 5 generates a machining program specifying the rotation speed per unit time of the main spindle 27 and the motion trajectories of the feed screws 22x, 22y, and 22z serving as feed shafts.
Based on the machining program, the NC machine tool 5 controls the motion of the main spindle 27 and of the feed screws 22x, 22y, and 22z to machine the workpiece 6. The acceleration sensor 71 measures the acceleration of vibration generated during machining, and the force sensor 72 measures the machining force generated during machining.
The sensor signal acquisition unit 2 of the work support apparatus 100a acquires a sensor signal SSa, the measurement result of the acceleration sensor 71. The sensor information conversion unit 3 of the work support apparatus 100a filters the sensor signal SSa acquired by the sensor signal acquisition unit 2 to convert it into a feature amount related to acceleration; in this example, high-frequency vibration components are removed by, for example, a low-pass filter. The sensory information output unit 4 of the work support apparatus 100a outputs sensory information SIa including the acceleration feature amount to the vibration generating device 81, which transmits the feature amount to the sensory organs of the worker 1. Examples of the vibration generating device 81 include a game controller and the vibrator used in a mobile phone. The vibration generating device 81 outputs the time waveform of the acceleration feature amount as vibration. This makes it possible to transmit the vibration generated during machining to the tactile sense of the worker 1.
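As a concrete, assumed illustration of the low-pass filtering step, a simple moving average attenuates the high-frequency vibration components; the window length is invented for the example and is not a value from the patent.

```python
def lowpass_moving_average(signal, window=3):
    """Simple FIR low-pass filter: average each sample with up to
    `window - 1` preceding samples, smoothing out rapid oscillations."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

raw = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]  # rapidly alternating acceleration samples
smoothed = lowpass_moving_average(raw)
```

The alternating input is flattened toward its local mean, which is the behaviour the worker feels as a steadier, lower-frequency vibration.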
The sensor signal acquisition unit 2 acquires a sensor signal SSb, the measurement result of the force sensor 72. The sensor information conversion unit 3 converts the sensor signal SSb acquired by the sensor signal acquisition unit 2 into a feature amount of vector information indicating the direction and magnitude of the force. The sensory information output unit 4 outputs sensory information SIb including the force vector feature amount to the display device 82, which transmits the feature amount to the sensory organs of the worker 1. Examples of the display device 82 include a video monitor and wearable glasses. The display device 82 outputs an image displaying the force vector feature amount. This makes it possible to transmit the machining force generated during machining to the visual sense of the worker 1. FIG. 4 is a diagram showing an example of an image displaying the feature amounts of the vector information extracted from the sensor signal SSb of the force sensor 72 shown in FIG. 2. In the image shown in FIG. 4, the direction and magnitude of the force measured every 10 ms by the force sensor 72 are displayed continuously as arrows, that is, as vectors:
arrows representing large forces are drawn as solid arrows, arrows representing intermediate forces as dashed arrows, and arrows representing small forces as dotted arrows. Looking at the image shown in FIG. 4, the worker 1 can recognize that a difference in the machining state may occur at the portions indicated by solid arrows, because the machining force there is large.
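The solid/dashed/dotted arrow rule of FIG. 4 can be sketched as a small classifier over force magnitude; the two thresholds and the sample force vectors are illustrative assumptions, not values from the patent.

```python
import math

def arrow_style(fx, fy, fz, large=50.0, small=20.0):
    """Choose the line style for one force-vector arrow by its magnitude:
    solid for large forces, dashed for intermediate, dotted for small."""
    magnitude = math.sqrt(fx * fx + fy * fy + fz * fz)
    if magnitude >= large:
        return "solid"
    if magnitude >= small:
        return "dashed"
    return "dotted"

# Three hypothetical 10 ms force samples (N) from the force sensor 72.
styles = [arrow_style(*f)
          for f in [(60.0, 0.0, 0.0), (20.0, 10.0, 5.0), (3.0, 4.0, 0.0)]]
```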
A skilled worker who determines the machining conditions to be set in the NC machine tool 5 can sense differences in the machining state of the workpiece 6 with his or her sensory organs, as differences in sound or in the appearance of chips, and can determine machining conditions that achieve the required machining accuracy. The worker 1, however, whose sensory organs have not been sufficiently trained, cannot perceive slight differences in the machining state of the workpiece 6. According to this example, the acceleration feature amount and the vector-information feature amount are transmitted to the sensory organs of the worker 1 during the work, allowing the worker 1 to recognize the vibration and machining force generated during machining. This enables even the worker 1 to perform work equivalent to work performed by a skilled worker. That is, the worker 1 obtains, as vibration and visual information, the machining information that a skilled worker infers from the sound generated during machining, the way chips fly, and so on, and can therefore judge the machining state without heightening the perceptual ability of his or her sensory organs.
By comparing the machining result with this machining information, the worker 1 can identify the locations where an abnormality or machining error occurs. The worker 1 thereby becomes able to determine machining conditions under which no abnormality or machining error occurs, that is, appropriate machining conditions.
 本実施例では、NC工作機械5は設定された加工条件である各パラメータに基づいて、加工プログラムを生成しているが、加工条件である各パラメータの設定結果から加工プログラムを生成するCAM(Computer Aided Manufacturing)ソフトウエアによって作成された加工プログラムが転送されてもよい。 In this embodiment, the NC machine tool 5 generates a machining program based on each parameter that is a set machining condition. However, a CAM (Computer) that generates a machining program from the setting result of each parameter that is a machining condition. A machining program created by (Aided Manufacturing) software may be transferred.
In this example, the work support apparatus 100a is applied to the NC machine tool 5, a milling-type machining center. The work support apparatus 100a may, however, be applied to an NC machine tool with a machine configuration different from a milling-type machining center, such as a lathe-type NC machine tool, a five-axis machine tool, a multi-tasking machine, a laser beam machine, or an electric discharge machine, or to an NC machine tool that creates the target shape by a different machining principle.
In this example, the filtering is performed with a low-pass filter, but it may be performed with any one of a low-pass filter, a high-pass filter, a band-pass filter, and a band-elimination filter, or with a combination of two or more of them. The feature quantity may be extracted by integrating a plurality of sensor signals into a vector, or the signal of a three-axis sensor may be extracted as a one-dimensional scalar quantity.
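The two options above, a filtering stage followed by feature extraction, can be sketched as follows. This is a minimal illustration, not part of the disclosed apparatus: the low-pass stage is assumed to be a simple moving average, and the one-dimensional scalar quantity for a three-axis sample is assumed to be the Euclidean norm, since the text does not fix a concrete filter design or norm.

```python
import math

def low_pass(signal, window=4):
    """Moving-average low-pass filter (an assumed stand-in for the
    low-pass stage described in the text; window size is arbitrary)."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

def scalar_feature(x, y, z):
    """Collapse one 3-axis sensor sample into a scalar (Euclidean norm)."""
    return math.sqrt(x * x + y * y + z * z)
```

A combination of two or more filters, as the text allows, would simply chain such stages, e.g. `low_pass(high_pass(signal))` for a crude band-pass.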
In this example, the acceleration sensor 71 and the force sensor 72 are used as the sensors constituting the sensor group 101a, but a temperature sensor, a laser displacement meter, a Doppler vibrometer, or the like may also be used. Several types of sensor may be attached to the same object, and several sensors of the same type may be attached to the same object.
In this example, the vibration generating device 81 and the display device 82 are used as the sensation generating devices constituting the sensation generating device group 104a, but other sensation generating devices, such as a sound generating device, may also be used.
In this example, the sensory information output unit 4 may output the sensory information SIa and SIb with a delay from the acquisition timing of the sensor signals SSa and SSb, may output the sensory information SIa and SIb repeatedly, and may output the sensory information SIa and SIb over a time shorter or longer than the time required to measure the sensor signals SSa and SSb.
Embodiment 2.
Next, a work support apparatus according to Embodiment 2 of the present invention will be described. FIG. 5 is a diagram for explaining the configuration of the work support apparatus according to Embodiment 2, and FIG. 6 is a diagram for explaining the configuration of the sensor information conversion unit shown in FIG. 5. The work support apparatus 100b according to Embodiment 2 differs from Embodiment 1 described above mainly in that it includes an output method setting unit 7 and in the configuration of the sensor information conversion unit 3. Descriptions of configurations and operations shared with Embodiment 1 are omitted; only the differences are described below.
The work support apparatus 100b shown in FIG. 5 includes the output method setting unit 7. The worker 1 sets the output method of the feature quantity in the output method setting unit 7, and the output method setting unit 7 outputs a setting command SC for the feature-quantity output method to the sensor information conversion unit 3. The output method setting unit 7 corresponds to a first setting unit.
The sensor information conversion unit 3 shown in FIG. 6 includes a sensory information setting unit 31, an algorithm storage unit 32, and a conversion calculation execution unit 33. The setting command SC output by the output method setting unit 7 is input to the sensory information setting unit 31, which, based on the command, outputs to the algorithm storage unit 32 a selection command CC for the arithmetic expression to be used in calculating the feature quantity. Based on the selection command CC, the algorithm storage unit 32 selects the arithmetic expression to be used from among the expressions it stores and outputs information AEI on the selected expression to the conversion calculation execution unit 33. The conversion calculation execution unit 33 calculates the feature quantity FV from the sensor signal SS and the arithmetic expression information AEI and outputs the feature quantity FV to the sensory information output unit 4. The sensory information output unit 4 selects the sensation generating device corresponding to the setting command SC and the feature quantity FV and outputs the sensory information SI to the selected device.
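The selection flow above (setting command SC, selection command CC, stored arithmetic expression, feature quantity FV) can be sketched as a lookup of a conversion function. The stored expression names ("rms", "peak") are hypothetical, since the text does not enumerate the expressions held in the algorithm storage unit 32.

```python
# Hypothetical algorithm store: selection command -> arithmetic expression.
algorithm_store = {
    "rms": lambda samples: (sum(s * s for s in samples) / len(samples)) ** 0.5,
    "peak": lambda samples: max(abs(s) for s in samples),
}

def convert(sensor_signal, selection_command):
    """Select an arithmetic expression by the selection command CC and
    apply it to the sensor signal SS to obtain the feature quantity FV."""
    expression = algorithm_store[selection_command]
    return expression(sensor_signal)

fv = convert([3.0, -4.0], "rms")  # root-mean-square of the two samples
```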
According to the present embodiment, the worker 1 can set the output method of the feature quantity, so the feature quantity can be conveyed to the sensory organ best suited to the worker 1.
Example of Embodiment 2.
Next, an example of the work support apparatus according to Embodiment 2 of the present invention will be described. FIG. 7 is a diagram for explaining an example in which the work support apparatus according to Embodiment 2 is applied to parameter adjustment work in pick-and-place by a robot. FIG. 8 is a perspective view showing a schematic configuration of the robot shown in FIG. 7, and FIG. 9 is a diagram for explaining an example of the arrangement of the sensation generating device group and the worker shown in FIG. 7.
The robot 8 shown in FIG. 8 grips a target workpiece 9 and moves it. The robot 8 includes a robot hand 51, a robot arm 52, and a motor 53 with a speed reducer. In the robot 8, the operations of the robot hand 51 and the motor 53 with a speed reducer are numerically controlled by a robot controller 73.
In this example, a force sensor 74 that measures the forces in three orthogonal directions acting on the robot hand 51 and the torque about each axis of the robot hand 51 is attached to the robot hand 51.
As shown in FIG. 7, when the work support apparatus 100b is applied to parameter adjustment work in pick-and-place by the robot 8, the work environment 102 is the robot 8, the work object 103 is the target workpiece 9, the sensors constituting the sensor group 101b are the robot controller 73 of the robot 8 and the force sensor 74 attached to the robot hand 51, and the sensation generating devices constituting the sensation generating device group 104b are the vibration generating device 81, the display device 82, a sound generating device 83, and a force sensation generating device 84.
Next, the flow of support for the work proficiency of the worker 1 using the work support apparatus 100b in the parameter adjustment work in pick-and-place by the robot 8 will be described. As shown in FIG. 9, the worker 1 wears a vibration generating device 81a, a display device 82a, and a sound generating device 83a. A sound generating device 83b and a force sensation generating device 84a are attached to the robot controller 73.
First, the worker 1 uses an input device to set the output method of the feature quantity in the output method setting unit 7 of the work support apparatus 100b. Examples of the input device are a keyboard, a mouse, buttons, and switches. For example, when the worker 1 wants the feature quantity of the force sensation information to be conveyed as sound, the worker 1 sets the sound generating device 83 to output that feature quantity as sound. The worker 1 also sets the frequency band to be extracted. The conversion calculation execution unit 33 then applies, to the sensor signal SSd of the force sensor 74, filtering that isolates the band set by the worker 1, and calculates the feature quantity FV.
The worker 1 determines the parameters for pick-and-place by the robot 8. The robot controller 73 acquires the current feedback value signal and the motor angle signal of each motor 53 with a speed reducer during pick-and-place by the robot 8. The force sensor 74 measures the forces in three orthogonal directions acting on the robot hand 51 and the torque about each axis of the robot hand 51 during pick-and-place by the robot 8.
The sensor signal acquisition unit 2 of the work support apparatus 100b acquires the sensor signal SSd, the measurement result of the force sensor 74. The sensor information conversion unit 3 of the work support apparatus 100b applies, to the sensor signal SSd acquired by the sensor signal acquisition unit 2, filtering that isolates the band set by the worker 1, and converts the result into a feature quantity of sound information. The sensory information output unit 4 of the work support apparatus 100b outputs the sensory information SIc containing the feature quantity of the sound information to the sound generating device 83, which conveys the feature quantity to the sensory organs of the worker 1. The sound generating device 83, an earphone for example, outputs the feature quantity of the sound information as sound intensity. The gripping force of the robot hand 51 generated during the work can thus be conveyed to the hearing of the worker 1 as a feature quantity of sound information.
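The final step above, turning the extracted feature quantity into a sound-intensity level for the earphone, might look like the following sketch. The 0 to 100 level scale and the full-scale value are assumptions of this illustration; the text does not specify how the feature quantity is scaled to volume.

```python
def to_volume(feature_value, full_scale=50.0):
    """Map a force feature quantity to a 0-100 sound-intensity level,
    clipping at an assumed full-scale calibration value."""
    return 100.0 * min(abs(feature_value), full_scale) / full_scale
```

A larger gripping force then produces a proportionally louder tone, saturating once the (assumed) full-scale force is reached.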
Because the sense of touch in the hands and the ability to grasp space differ from worker to worker, it is not appropriate to provide work support with a single, fixed output method for the feature quantity. In this example, the worker 1 can set the output method of the feature quantity with the input device. Consequently, even workers 1 who differ in physical ability, preference, or perception can have the feature quantity conveyed to the sensory organ best suited to them individually, so a wide range of people can be engaged in the work.
In this example, the worker 1 can also output the feature quantity extracted from the sensor signal SSd of the force sensor 74 to the other sensation generating devices of the sensation generating device group 104b. For example, when a worker 1 with reduced hearing performs the work, the state quantity of the work can be displayed as visual information, and when an elderly worker 1 with declining eyesight performs the work, it can be conveyed as vibration.
In this example, sensation generating devices using hearing, vision, and touch are used, but other devices may be used. For example, an olfactory generating device may be used that prepares several odor-emitting fragrances and conveys the gripping force by differences in odor, or one that controls the strength of an exhaust fan on a housing containing a sealed fragrance and conveys the gripping force by the intensity of the odor. Furthermore, a taste generating device that conveys the gripping force by changing the sense of taste through electrical stimulation applied inside the mouth may be used.
Embodiment 3.
Next, a work support apparatus according to Embodiment 3 of the present invention will be described. FIG. 10 is a diagram for explaining the configuration of the work support apparatus according to Embodiment 3. The work support apparatus 100c according to Embodiment 3 differs from Embodiment 1 described above mainly in that it includes an information storage unit 10. Descriptions of configurations and operations shared with Embodiment 1 are omitted; only the differences are described below.
The work support apparatus 100c shown in FIG. 10 includes the information storage unit 10. The information storage unit 10 stores the sensor signal CSS acquired by the sensor signal acquisition unit 2 in the current work and the sensor signal PSS acquired by the sensor signal acquisition unit 2 in past work. The information storage unit 10 corresponds to a first storage unit, the sensor signal CSS to a first sensor signal, and the sensor signal PSS to a second sensor signal.
The sensor information conversion unit 3 converts the sensor signal CSS acquired by the sensor signal acquisition unit 2, by calculation or the like, into a feature quantity FVa that human sensory organs can perceive; the information storage unit 10 may store the feature quantity FVa. The sensor information conversion unit 3 likewise converts the sensor signal PSS stored in the information storage unit 10 into a feature quantity FVb that human sensory organs can perceive; the information storage unit 10 may store the feature quantity FVb. The sensory information output unit 4 outputs the sensory information SIa containing the feature quantity FVa and the sensory information SIb containing the feature quantity FVb to the sensation generating device group 104, which conveys the feature quantities FVa and FVb to the sensory organs of the worker 1.
According to the present embodiment, during the work of the worker 1, the physical quantities PQ1 and PQ2 of the work environment 102 and the work object 103 in the current work can be conveyed to the sensory organs of the worker 1 as the feature quantity FVa, and the physical quantities PQ1 and PQ2 of the work environment 102 and the work object 103 in past work can be conveyed as the feature quantity FVb. The worker 1 can thereby recognize the difference between the current work and the past work.
In the present embodiment, the sensation generating devices constituting the sensation generating device group 104 may convey the past feature quantity FVb and the current feature quantity FVa to the worker 1 simultaneously or in sequence, or may convey the difference between the past feature quantity FVb and the current feature quantity FVa.
When the display device 82 is used as the sensation generating device, the past feature quantity FVb and the current feature quantity FVa may be displayed superimposed; the worker 1 can then easily grasp, for example, the difference in effect between the machining conditions currently set and those set in the past. When the sound generating device 83 is used as the sensation generating device, the difference between the past feature quantity FVb and the current feature quantity FVa may be conveyed to the worker 1 as sound volume.
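The difference output described above can be sketched as follows. Treating the feature quantities FVb and FVa as equal-length sample lists is an assumption of this illustration.

```python
def feature_difference(fv_past, fv_current):
    """Element-wise difference between the past feature quantity FVb and
    the current feature quantity FVa, e.g. for output as sound volume."""
    return [c - p for p, c in zip(fv_past, fv_current)]
```

A display device would plot both lists superimposed, while a sound device would map the magnitude of each difference to volume.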
Embodiment 4.
Next, a work support apparatus according to Embodiment 4 of the present invention will be described. FIG. 11 is a diagram for explaining the configuration of the work support apparatus according to Embodiment 4. The work support apparatus 100d according to Embodiment 4 differs from Embodiment 3 described above mainly in that it includes a training evaluation input unit 11. Descriptions of configurations and operations shared with Embodiment 3 are omitted; only the differences are described below.
The work support apparatus 100d shown in FIG. 11 includes the training evaluation input unit 11. A training instructor 12 uses an input device to enter, into the training evaluation input unit 11, the conditions of the current work of the worker 1 and a training evaluation of that work. The training evaluation input unit 11 corresponds to a first input unit. Examples of the input device are a keyboard, a mouse, buttons, and switches. FIG. 12 is a diagram showing an example of the screen for entering conditions into the training evaluation input unit 11 shown in FIG. 11; the screen shown is for entering the machining conditions of the NC machine tool 5 described above. The training evaluation that the training instructor 12 enters into the training evaluation input unit 11 indicates whether the work performed by the worker 1 was good or poor. It may be a binary input such as pass or fail, a five-level numerical rating, or a score out of 100. The conditions and training evaluation entered into the training evaluation input unit 11 are stored in the information storage unit 10 in association with the current sensor signal CSS.
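The association of conditions and evaluations with the current sensor signal can be sketched as a simple record store. The field names and the choice of a five-level rating are illustrative assumptions; the text equally allows pass/fail or a 100-point score.

```python
records = []

def save_evaluation(sensor_signal, condition, evaluation):
    """Store the conditions and training evaluation in association with
    the current sensor signal CSS (here a 1-5 rating, one of the
    evaluation formats named in the text)."""
    if evaluation not in (1, 2, 3, 4, 5):
        raise ValueError("expected a five-level rating")
    records.append({"signal": sensor_signal,
                    "condition": condition,
                    "evaluation": evaluation})

# Hypothetical machining conditions for one trial by the worker.
save_evaluation([0.12, 0.10], {"feed": 1000, "speed": 8000}, 4)
```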
FIG. 13 is a diagram showing an example of a screen on which the conditions and training evaluations stored in the information storage unit 10 shown in FIG. 11 are displayed on the display device 82 for each worker. In FIG. 13, the conditions and training evaluations that the training instructor 12 entered into the training evaluation input unit when three workers performed the work as the worker 1 are displayed on a three-dimensional map, with good evaluations marked by circles and poor evaluations by crosses.
According to the present embodiment, by seeing the conditions and training evaluations, the worker 1 can learn the points to improve in the work and the weaknesses of his or her own work without receiving direct instruction from the training instructor 12.
According to the present embodiment, the worker 1 can be made aware of the training evaluation even when, for example, the worker 1 is a foreign worker who speaks only a language different from that of the training instructor 12 and the two cannot communicate adequately in words.
Embodiment 5.
Next, a work support apparatus according to Embodiment 5 of the present invention will be described. FIG. 14 is a diagram for explaining the configuration of the work support apparatus according to Embodiment 5. The work support apparatus 100e according to Embodiment 5 differs from Embodiment 4 described above mainly in that it includes the output method setting unit 7, a know-how input unit 13, and a know-how storage unit 14. Descriptions of configurations and operations shared with Embodiments 2 and 4 are omitted; only the differences are described below.
The work support apparatus 100e shown in FIG. 14 is used to pass on to the worker 1 skills held by the training instructor 12 that are difficult to document. The training instructor 12 corresponds to a first worker, and the worker 1 to a second worker. The work support apparatus 100e includes the output method setting unit 7, the know-how input unit 13, and the know-how storage unit 14. The training instructor 12 enters know-how information into the know-how input unit 13. Examples of know-how information are the types of sensors to consult, the algorithms to use, the types of sensation generating devices that are easy to perceive, and the tendencies of the feature quantities that yield good results. The know-how input unit 13 corresponds to a second input unit. The know-how storage unit 14 stores the know-how information entered into the know-how input unit 13 and corresponds to a second storage unit.
FIG. 15 is a flowchart showing an example of a work training procedure using the work support apparatus 100e shown in FIG. 14. First, the training instructor 12 performs a demonstration of the work using the work support apparatus 100e (step S1). Next, the training instructor 12 enters the know-how information of his or her own work into the know-how input unit 13 (step S2). The know-how information entered into the know-how input unit 13 is stored in the know-how storage unit 14, and the sensor signal SS from the work by the training instructor 12 is stored in the information storage unit 10 as sensor information acquired in past work. Next, the worker 1 uses the work support apparatus 100e to perform the same work as the training instructor 12 (step S3). The sensor signal from the work by the worker 1 is stored in the information storage unit 10 as sensor information acquired in the current work. The worker 1 receives both the feature quantity extracted from the sensor signal SS when the training instructor 12 performed the work and the feature quantity extracted from the sensor signal SS of his or her own work. The worker 1 then consults the know-how information stored in the know-how storage unit 14 and changes the output method (step S4). While the target skill has not been mastered (No in step S5), steps S3 and S4 are repeated; when the target skill has been mastered (Yes in step S5), the work training ends.
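The loop of steps S1 to S5 can be sketched as follows. The callback structure (demonstrate, attempt, mastered) and the round limit are assumptions of this illustration, introduced only to express the flowchart's control flow in code.

```python
def work_training(demonstrate, attempt, mastered, max_rounds=10):
    """Loop of FIG. 15: instructor demonstration, then repeated trainee
    attempts (with output-method changes between attempts) until the
    target skill is mastered."""
    reference = demonstrate()  # S1/S2: demo performed, know-how saved
    rounds = 0
    while not mastered() and rounds < max_rounds:  # S5: mastery check
        attempt(reference)     # S3: same work, features compared
        rounds += 1            # S4: worker adjusts the output method
    return rounds
```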
According to the present embodiment, the feature quantity extracted from the sensor signal SS when the training instructor 12 performed the work can be compared with the feature quantity extracted from the sensor signal SS when the worker 1 performed his or her own work, so the skill can be passed on to the worker 1 without direct instruction from the training instructor 12. For example, a skilled worker with no successor can save the sensor information and know-how information of his or her own skill, so that the skill can be passed on when a successor appears after the skill has declined with age or after the skilled worker has died.
Embodiment 6.
Next, a work support apparatus according to Embodiment 6 of the present invention will be described. FIG. 16 is a diagram for explaining the configuration of the work support apparatus according to Embodiment 6. The work support apparatus 100f according to Embodiment 6 differs from Embodiment 5 described above mainly in that it includes an AI (Artificial Intelligence) analysis unit 15. Descriptions of configurations and operations shared with Embodiment 5 are omitted; only the differences are described below.
The work support apparatus 100f shown in FIG. 16 includes the AI analysis unit 15. The AI analysis unit 15 extracts know-how information from the sensor signal PSS, the conditions, and the evaluation results stored in the information storage unit 10, and stores the extracted know-how information in the know-how storage unit 14. The algorithm used for analysis in the AI analysis unit 15 is supervised learning: a training set is created from the sensor signal PSS, the conditions, and the evaluation results, and learning is performed on it. The AI analysis unit 15 may use, for example, a neural network or a Q-learning algorithm.
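Building the supervised training set described above can be sketched as pairing (sensor signal, conditions) inputs with evaluation labels. The record layout is hypothetical, and the learner itself (a neural network or Q-learning, as the text mentions) is omitted from this sketch.

```python
def build_training_set(saved_records):
    """Create a supervised-learning training set: inputs are
    (sensor signal PSS, conditions); labels are the evaluation results."""
    inputs = [(r["signal"], r["condition"]) for r in saved_records]
    labels = [r["evaluation"] for r in saved_records]
    return inputs, labels
```

The resulting `(inputs, labels)` pair would then be fed to whatever learner the AI analysis unit 15 employs.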
According to the present embodiment, the AI analysis unit 15 can extract tacit knowledge of which neither the training instructor 12 nor the worker 1 is aware and convey it to the worker 1.
Embodiment 7.
Next, a work support apparatus according to Embodiment 7 of the present invention will be described. FIG. 17 is a diagram for explaining the configuration of the work support apparatus according to Embodiment 7. The work support apparatus 100g according to Embodiment 7 differs from Embodiment 1 described above mainly in that it includes a theoretical value setting unit 16 and a theoretical value calculation unit 17. Descriptions of configurations and operations shared with Embodiment 1 are omitted; only the differences are described below.
The work support apparatus 100g shown in FIG. 17 includes the theoretical value setting unit 16 and the theoretical value calculation unit 17. The worker 1 sets, in the theoretical value setting unit 16, the theoretical value of the physical quantity PQ1 of the work environment 102 and the theoretical value of the physical quantity PQ2 of the work object 103. The theoretical value setting unit 16 corresponds to a second setting unit. The theoretical value calculation unit 17 generates a sensor signal TSS calculated theoretically from the theoretical values set in the theoretical value setting unit 16 and outputs the sensor signal TSS to the sensor information conversion unit 3. The theoretical value calculation unit 17 corresponds to a calculation unit.
 The sensor information conversion unit 3 converts the sensor signal SS acquired by the sensor signal acquisition unit 2 into a feature quantity FVa that can be perceived by a human sensory organ, by computation or the like. Likewise, the sensor information conversion unit 3 converts the sensor signal TSS input from the theoretical value calculation unit 17 into a feature quantity FVb that can be perceived by a human sensory organ. The sensory information output unit 4 outputs sensory information SIa including the feature quantity FVa and sensory information SIb including the feature quantity FVb, both produced by the sensor information conversion unit 3, to the sensation generating device group 104. The sensation generating device group 104 conveys the feature quantities FVa and FVb to the sensory organs of the worker 1.
 According to the present embodiment, the worker 1 can compare the feature quantity FVa extracted from the actual sensor signal SS with the feature quantity FVb extracted from the sensor signal TSS calculated theoretically from the theoretical values. This enables the worker 1 to acquire tacit knowledge that cannot be derived theoretically from the theoretical values alone.
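 The comparison described in this embodiment can be illustrated with a small sketch. Assuming, purely for illustration, that the sensor signal is a sampled vibration waveform and that the feature quantity presented to the worker is its dominant frequency and RMS amplitude (the patent does not prescribe any particular conversion; the function and signal parameters below are hypothetical), the pipeline might look like:

```python
import numpy as np

def to_feature_quantity(signal, sample_rate):
    """Convert a sampled sensor signal into a human-perceivable
    feature quantity: (dominant frequency in Hz, RMS amplitude).
    One hypothetical conversion; the device could use others."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    dominant_hz = float(freqs[np.argmax(spectrum)])
    rms = float(np.sqrt(np.mean(np.square(signal))))
    return dominant_hz, rms

# Actual sensor signal SS (measured) vs. theoretical signal TSS
# (computed from the theoretical values set in unit 16).
rate = 1000  # samples per second (assumed)
t = np.arange(0, 1.0, 1.0 / rate)
tss = np.sin(2 * np.pi * 50 * t)                 # ideal 50 Hz vibration
ss = tss + 0.3 * np.sin(2 * np.pi * 120 * t)     # measured: extra component

fva = to_feature_quantity(ss, rate)   # presented as sensory information SIa
fvb = to_feature_quantity(tss, rate)  # presented as sensory information SIb

# The worker compares FVa and FVb; the gap between them reflects
# effects that cannot be derived from the theoretical values alone.
print(f"FVa (actual):      {fva[0]:.0f} Hz, RMS {fva[1]:.3f}")
print(f"FVb (theoretical): {fvb[0]:.0f} Hz, RMS {fvb[1]:.3f}")
```

 Here the measured signal contains a 120 Hz component absent from the theoretical one, so the RMS amplitudes of FVa and FVb differ — the kind of discrepancy the worker is meant to sense and internalize.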
 Next, the hardware configuration of the work support devices in the embodiments described above will be described. FIG. 18 is a diagram illustrating an example of the hardware configuration of the work support devices 100a, 100b, 100c, 100d, 100e, 100f, and 100g (hereinafter collectively referred to as the "work support device 100") in each of the embodiments described above.
 The work support device 100 includes an input/output interface circuit 201, which contains an input circuit that receives information from outside the work support device 100 and an output circuit that outputs information to the outside of the work support device 100, together with a processor 202 and a memory 203. The input/output interface circuit 201 sends information received from the outside to the memory 203. The memory 203 stores the information received from the input/output interface circuit 201. The memory 203 also stores a computer program. The processor 202 reads the computer program stored in the memory 203 and performs arithmetic processing based on the information stored in the memory 203. Computation result information indicating the results produced by the processor 202 is sent to the memory 203. The input/output interface circuit 201 sends information stored in the memory 203 to the outside.
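 As a rough functional sketch of this data flow (not the actual device firmware; all class and method names below are illustrative), the interaction among the input/output interface circuit 201, the processor 202, and the memory 203 can be modeled as:

```python
class WorkSupportDevice:
    """Minimal model of the hardware data flow in FIG. 18:
    external input -> I/O interface 201 -> memory 203 ->
    processor 202 -> memory 203 -> I/O interface 201 -> output."""

    def __init__(self, program):
        # Memory 203 holds the computer program and all information.
        self.memory = {"program": program, "input": None, "result": None}

    def io_receive(self, information):
        # Input circuit of I/O interface circuit 201: forward
        # externally received information to the memory 203.
        self.memory["input"] = information

    def process(self):
        # Processor 202: read the program from memory 203, perform
        # arithmetic processing on the stored information, and send
        # the computation result information back to memory 203.
        program = self.memory["program"]
        self.memory["result"] = program(self.memory["input"])

    def io_send(self):
        # Output circuit of I/O interface circuit 201: send
        # information stored in the memory 203 to the outside.
        return self.memory["result"]

# Example: the stored program converts a raw sensor value into a
# feature quantity (here, a hypothetical scaling to a display level).
device = WorkSupportDevice(program=lambda x: x * 0.5)
device.io_receive(8.0)
device.process()
print(device.io_send())
```

 Each of the sensor-processing units described in the embodiments (the acquisition, conversion, and output units) would be realized as such a program executed by the processor 202.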
 The configurations described in the above embodiments illustrate examples of the subject matter of the present invention; they can be combined with other known techniques, and parts of the configurations can be omitted or modified without departing from the gist of the present invention.
 1 worker, 2 sensor signal acquisition unit, 3 sensor information conversion unit, 4 sensory information output unit, 5 NC machine tool, 6 machining workpiece, 7 output method setting unit, 8 robot, 9 target workpiece, 10 information storage unit, 11 training evaluation input unit, 12 training instructor, 13 know-how input unit, 14 know-how storage unit, 15 AI analysis unit, 16 theoretical value setting unit, 17 theoretical value calculation unit, 20x, 20y, 20z feed axis mechanisms, 21x, 21y, 21z rotary motors, 22x, 22y, 22z feed screws, 23x, 23y, 23z rotation angle detectors, 24 table, 25 column, 26 ram, 27 spindle, 31 sensory information setting unit, 32 algorithm storage unit, 33 conversion operation execution unit, 51 robot hand, 52 robot arm, 53 motor with speed reducer, 71 acceleration sensor, 72 force sensor, 73 robot controller, 74 force sense sensor, 81, 81a vibration generating devices, 82, 82a display devices, 83, 83a, 83b sound generating devices, 84, 84a force sense generating devices, 100, 100a, 100b, 100c, 100d, 100e, 100f, 100g work support devices, 101, 101a, 101b sensor groups, 102 work environment, 103 work object, 104, 104a, 104b sensation generating device groups, 201 input/output interface circuit, 202 processor, 203 memory.

Claims (9)

  1.  A work support device comprising:
      an acquisition unit that acquires a sensor signal from a sensor that measures a state quantity of a work environment or a work object;
      a conversion unit that converts the sensor signal into a feature quantity that can be perceived by a sensory organ of a worker; and
      an output unit that outputs the feature quantity so that it can be perceived by the worker's sensory organ.
  2.  The work support device according to claim 1, further comprising a first setting unit in which an output method of the feature quantity is set,
      wherein the conversion unit selects an arithmetic expression for the feature quantity based on a setting command output from the first setting unit, and performs the conversion into the feature quantity using the selected arithmetic expression.
  3.  The work support device according to claim 1 or 2, further comprising a first storage unit that stores a first sensor signal, which is a sensor signal of the sensor relating to a current work, and a second sensor signal, which is a sensor signal of the sensor relating to a past work,
      wherein the conversion unit converts the first sensor signal into a first feature quantity and converts the second sensor signal into a second feature quantity, and
      the output unit outputs the first feature quantity and the second feature quantity.
  4.  The work support device according to claim 3, wherein the second sensor signal is acquired through work performed by a first worker, and the first sensor signal is acquired through work performed by a second worker.
  5.  The work support device according to claim 3, wherein the first storage unit stores the first feature quantity and the second feature quantity.
  6.  The work support device according to any one of claims 3 to 5, further comprising a first input unit in which an evaluation of a work is input,
      wherein the first storage unit stores the evaluation in association with a third sensor signal, which is a sensor signal of the sensor relating to the work that is the subject of the evaluation.
  7.  The work support device according to any one of claims 1 to 6, further comprising:
      a second input unit in which know-how information is input; and
      a second storage unit that stores the know-how information.
  8.  The work support device according to claim 6, further comprising an Artificial Intelligence analysis unit that extracts know-how information based on the third sensor signal and the evaluation stored in the first storage unit.
  9.  The work support device according to any one of claims 1 to 8, further comprising:
      a second setting unit in which a theoretical value of the state quantity is set; and
      a calculation unit that generates a fourth sensor signal calculated theoretically from the theoretical value.
PCT/JP2018/017060 2018-04-26 2018-04-26 Operation assistance device WO2019207732A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020515406A JP7128267B2 (en) 2018-04-26 2018-04-26 Work support device
PCT/JP2018/017060 WO2019207732A1 (en) 2018-04-26 2018-04-26 Operation assistance device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/017060 WO2019207732A1 (en) 2018-04-26 2018-04-26 Operation assistance device

Publications (1)

Publication Number Publication Date
WO2019207732A1 true WO2019207732A1 (en) 2019-10-31

Family

ID=68293886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/017060 WO2019207732A1 (en) 2018-04-26 2018-04-26 Operation assistance device

Country Status (2)

Country Link
JP (1) JP7128267B2 (en)
WO (1) WO2019207732A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05135063A (en) * 1991-06-10 1993-06-01 Fujitsu Ltd Parts evaluation support system
JP2004240264A (en) * 2003-02-07 2004-08-26 Mitsubishi Electric Corp Bodily sensation type training system
JP2014078134A (en) * 2012-10-10 2014-05-01 Nippon Telegr & Teleph Corp <Ntt> Sensor information high-speed data processing system and sensor information high-speed processing/display system
WO2017159562A1 (en) * 2016-03-14 2017-09-21 オムロン株式会社 Action information generation device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4419384B2 (en) * 2002-11-28 2010-02-24 株式会社安川電機 State quantity presentation device and method
JP4351725B2 (en) * 2007-07-23 2009-10-28 新日本製鐵株式会社 Operation support device, operation support system, and computer program
JP6231362B2 (en) * 2013-11-25 2017-11-15 アズビル株式会社 Plant monitoring server and plant monitoring method
JP2015225364A (en) * 2014-05-26 2015-12-14 富士通株式会社 Manufacturing method and manufacturing management program
JP6776616B2 (en) * 2016-05-23 2020-10-28 富士ゼロックス株式会社 Work guidance device and program

Also Published As

Publication number Publication date
JP7128267B2 (en) 2022-08-30
JPWO2019207732A1 (en) 2020-12-03


Legal Events

Code  Description
121   Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18916918; Country of ref document: EP; Kind code of ref document: A1)
ENP   Entry into the national phase (Ref document number: 2020515406; Country of ref document: JP; Kind code of ref document: A)
NENP  Non-entry into the national phase (Ref country code: DE)
122   Ep: pct application non-entry in european phase (Ref document number: 18916918; Country of ref document: EP; Kind code of ref document: A1)