WO2021250971A1 - Grip determination device, grip determination method, and grip determination program

Grip determination device, grip determination method, and grip determination program

Info

Publication number
WO2021250971A1
Authority
WO
WIPO (PCT)
Prior art keywords
grip
series data
data
time
determination
Prior art date
Application number
PCT/JP2021/011421
Other languages
English (en)
Japanese (ja)
Inventor
智貴 西出
Original Assignee
株式会社村田製作所
Priority date
Filing date
Publication date
Application filed by 株式会社村田製作所
Priority to DE112021000332.1T (published as DE112021000332T5)
Priority to JP2022530034A (published as JP7355242B2)
Publication of WO2021250971A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01L: MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L 1/00: Measuring force or stress, in general
    • G01L 1/16: Measuring force or stress, in general using properties of piezoelectric devices

Definitions

  • the present invention relates to a grip determination device, a grip determination method, and a grip determination program.
  • Patent Document 1 proposes a measurement method that calculates the standard deviation of the measured values of a piezoelectric sensor and determines, based on the calculated standard deviation, whether or not to output the measured values of the piezoelectric sensor. According to the measurement method proposed in Patent Document 1, measurement by the piezoelectric sensor can be started when the object actually vibrates, so that unnecessary measurement by the piezoelectric sensor while the object is not vibrating can be suppressed.
  • However, Patent Document 1 has the following problem. The standard deviation of the signal values of a piezoelectric element makes it possible to determine whether or not vibration is occurring in the object to which the piezoelectric element is attached. However, since the standard deviation only indicates how scattered the data are, only a simple event, such as a force being applied to the object, can be grasped from the calculated standard deviation value, and it is difficult to accurately grasp the behavior with respect to the object. In particular, it is difficult to accurately determine, based on the standard deviation value, whether or not the behavior with respect to the object is a grip (for example, to discriminate a grip from a noise-like behavior such as periodic knocking).
  • The present invention, in one aspect, has been made in view of such circumstances, and an object thereof is to provide a technique for accurately determining, using a piezoelectric element, whether or not an action with respect to an object is a grip.
  • The grip determination device according to one aspect of the present invention includes: a data acquisition unit that acquires time-series data indicating, in time series, the measured values of a piezoelectric element attached to an object; a calculation unit that calculates, from the acquired time-series data, one or more first feature quantities related to the shape of the waveform of the measured values arranged in time series; a determination unit that determines, using a trained determination model generated by machine learning, whether or not the action with respect to the object is a grip based on the calculated one or more first feature quantities; and an output unit that outputs the determination result.
  • The present inventor found that, by using feature quantities related to the shape of the waveform of the measured values, which can be calculated from the time-series data of the piezoelectric element, as explanatory variables and by adopting machine learning as the method of generating the determination model, it is possible to generate a determination model that can determine with high accuracy whether or not the action with respect to the object is a grip. Therefore, according to this configuration, by providing the trained determination model generated in this way, it is possible to accurately determine, using the piezoelectric element, whether or not the action with respect to the object is a grip.
  • The time-series data may include first series data configured by arranging the signal values directly obtained from the piezoelectric element in time series. According to this configuration, it is possible to accurately and easily determine, using the piezoelectric element, whether or not the action with respect to the object is a grip.
  • The time-series data may include second series data generated by standardizing original series data composed of the signal values directly obtained from the piezoelectric element arranged in time series. If noise such as a knock is discriminated from a grip based on the average value of the measured values, noise having the same intensity as a grip may be erroneously determined to be a grip. Similarly, when the trained determination model is configured in that way, erroneous determinations due to the average of the measured values can occur. According to this configuration, the average value of the time-series data can be made constant by standardization. As a result, the possibility of erroneous determination due to the average value of the measured values can be reduced, and an improvement in determination accuracy can be expected.
  • The time-series data may include third series data generated by calculating the difference sequence of the original series data configured by arranging the signal values directly obtained from the piezoelectric element in time series.
  • According to this configuration, fluctuation (noise) of the baseline in the signal of the piezoelectric element can be removed by the difference calculation. That is, the baseline of the signal can be unified to some extent. This can be expected to improve the determination accuracy.
  • The time-series data may include fourth series data generated by calculating the difference sequence of the original series data, configured by arranging the signal values directly obtained from the piezoelectric element in time series, and standardizing it. According to this configuration, an improvement in determination accuracy can be expected from the above-mentioned effects of the difference calculation and the standardization.
  • The one or more first feature quantities may be composed of a predetermined percentile value, skewness, kurtosis, or a combination thereof. According to this configuration, feature quantities in which the characteristics of the waveform of the measured values appear can be appropriately obtained, and it is thereby possible to accurately determine whether or not the action with respect to the object is a grip.
  • The time length of the time-series data may be 400 ms or more. According to this configuration, by setting the time length of the time-series data to 400 ms or more, an improvement in the accuracy of the determination by the trained determination model can be expected when discriminating between noise of 4 Hz to 30 Hz and a grip.
  • Determining whether or not the action with respect to the object is a grip may include determining whether the action with respect to the object is a grip or another, noise-like action. According to this configuration, it is possible to accurately discriminate between grip behavior and other noise-like behavior.
  • The calculation unit may further calculate one or more second feature quantities related to the amplitude from the acquired time-series data. The determination unit may then further determine the strength of the grip with respect to the object based on the calculated one or more second feature quantities, using the trained determination model. From the results of the experimental examples described later, the present inventor also found that the strength of the grip can be accurately determined by using feature quantities related to the amplitude of the measured values, which can be calculated from the time-series data of the piezoelectric element, as explanatory variables. Therefore, according to this configuration, when the action with respect to the object is a grip, the strength of the grip can be accurately determined.
  • The one or more second feature quantities may be composed of a minimum value, a maximum value, a standard deviation, or a combination thereof. According to this configuration, feature quantities in which the characteristics of the amplitude of the measured values appear can be appropriately obtained, whereby the strength of the grip can be accurately determined.
  • One aspect of the present invention may be an information processing method that realizes all or a part of each of the above configurations, or may be a program.
  • It may also be a storage medium that stores such a program and that can be read by a computer, another device, a machine, or the like.
  • A storage medium readable by a computer or the like is a medium that accumulates information such as programs by electrical, magnetic, optical, mechanical, or chemical action.
  • In the grip determination method according to one aspect of the present invention, a computer executes: a step of acquiring time-series data indicating, in time series, the measured values of a piezoelectric element attached to an object; a step of calculating, from the acquired time-series data, one or more feature quantities related to the shape of the waveform of the measured values arranged in time series; a step of determining, using a trained determination model generated by machine learning, whether or not the action with respect to the object is a grip based on the calculated one or more feature quantities; and a step of outputting the determination result.
  • The one or more feature quantities may be composed of a predetermined percentile value, skewness, kurtosis, or a combination thereof.
  • The grip determination program according to one aspect of the present invention causes a computer to execute: a step of acquiring time-series data indicating, in time series, the measured values of a piezoelectric element attached to an object; a step of calculating, from the acquired time-series data, one or more feature quantities related to the shape of the waveform of the measured values arranged in time series; a step of determining, using a trained determination model generated by machine learning, whether or not the action with respect to the object is a grip based on the calculated one or more feature quantities; and a step of outputting the determination result.
  • According to the present invention, it is possible to determine, using a piezoelectric element, whether the action with respect to the object is a grip or other noise.
  • FIG. 1 schematically illustrates an example of a situation in which the present invention is applied.
  • FIG. 2 schematically illustrates an example of the hardware configuration of the grip determination device according to the embodiment.
  • FIG. 3 schematically illustrates an example of the hardware configuration of the model generator according to the embodiment.
  • FIG. 4 schematically illustrates an example of the software configuration of the grip determination device according to the embodiment.
  • FIG. 5 schematically illustrates an example of the software configuration of the model generator according to the embodiment.
  • FIG. 6 is a flowchart showing an example of a processing procedure of the model generator according to the embodiment.
  • FIG. 7 is a flowchart showing an example of the processing procedure of the grip determination device according to the embodiment.
  • FIG. 8A shows a sample of measurement data obtained in non-contact.
  • FIG. 8B shows a sample of measurement data obtained during grip.
  • FIG. 8C shows a sample of measurement data obtained during noise (knock).
  • FIG. 9A shows a sample histogram of the original series data obtained at the time of non-contact.
  • FIG. 9B shows a sample histogram of the difference series data obtained at the time of non-contact.
  • FIG. 9C shows a sample histogram of the standardized sequence data obtained at non-contact.
  • FIG. 9D shows a sample histogram of the standardized difference series data obtained at the time of non-contact.
  • FIG. 10A shows a sample histogram of the original series data obtained at the time of gripping.
  • FIG. 10B shows a sample histogram of the difference series data obtained at the time of gripping.
  • FIG. 10C shows a sample histogram of the standardized sequence data obtained during grip.
  • FIG. 10D shows a sample histogram of the standardized difference sequence data obtained during grip.
  • FIG. 11A shows a sample histogram of the original series data obtained at the time of noise (knock).
  • FIG. 11B shows a sample histogram of the difference series data obtained at the time of noise.
  • FIG. 11C shows a sample histogram of the standardized sequence data obtained during noise.
  • FIG. 11D shows a sample histogram of the standardized difference sequence data obtained during noise.
  • FIG. 12 is a graph showing the discrimination accuracy of the trained determination model according to the experimental example.
  • FIG. 13A is a graph showing the results of calculating 25% values from the original series data obtained during non-contact, grip, and noise.
  • FIG. 13B is a graph showing the results of calculating 75% values from the original series data obtained during non-contact, grip, and noise.
  • FIG. 13C is a graph showing the result of calculating the skewness from the original series data obtained in the case of non-contact, grip, and noise.
  • FIG. 13D is a graph showing the results of calculating the kurtosis from the original series data obtained during non-contact, grip, and noise.
  • FIG. 13E is a graph showing the result of calculating the minimum value from the original series data obtained in the case of non-contact, grip, and noise.
  • FIG. 13F is a graph showing the result of calculating the maximum value from the original series data obtained in the case of non-contact, grip, and noise.
  • FIG. 13G is a graph showing the results of calculating the standard deviation from the original series data obtained during non-contact, grip, and noise.
  • FIG. 13H is a graph showing the result of calculating a 50% value from the original series data obtained in the case of non-contact, grip, and noise.
  • FIG. 14A is a graph showing the result of calculating the 25% value from the difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 14B is a graph showing the result of calculating a 75% value from the difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 14C is a graph showing the result of calculating the skewness from the difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 14D is a graph showing the results of calculating the kurtosis from the difference series data obtained during non-contact, grip, and noise.
  • FIG. 14E is a graph showing the result of calculating the minimum value from the difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 14F is a graph showing the result of calculating the maximum value from the difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 14G is a graph showing the results of calculating the standard deviation from the difference series data obtained during non-contact, grip, and noise.
  • FIG. 14H is a graph showing the result of calculating the 50% value from the difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 15A is a graph showing the results of calculating 25% values from standardized sequence data obtained for non-contact, grip, and noise.
  • FIG. 15B is a graph showing the results of calculating 75% values from standardized sequence data obtained for non-contact, grip, and noise.
  • FIG. 15C is a graph showing the result of calculating the skewness from the standardized sequence data obtained in the case of non-contact, grip, and noise.
  • FIG. 15D is a graph showing the results of calculating kurtosis from standardized sequence data obtained during non-contact, grip, and noise.
  • FIG. 15E is a graph showing the results of calculating the minimum value from the standardized sequence data obtained during non-contact, grip, and noise.
  • FIG. 15F is a graph showing the result of calculating the maximum value from the standardized series data obtained in the case of non-contact, grip, and noise.
  • FIG. 15G is a graph showing the results of calculating 50% values from standardized sequence data obtained during non-contact, grip, and noise.
  • FIG. 16A is a graph showing the results of calculating 25% values from the standardized difference series data obtained during non-contact, grip, and noise.
  • FIG. 16B is a graph showing the results of calculating 75% values from the standardized difference series data obtained during non-contact, grip, and noise.
  • FIG. 16C is a graph showing the result of calculating the skewness from the standardized difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 16D is a graph showing the results of calculating the kurtosis from the standardized difference sequence data obtained during non-contact, grip, and noise.
  • FIG. 16E is a graph showing the result of calculating the minimum value from the standardized difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 16F is a graph showing the result of calculating the maximum value from the standardized difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 16G is a graph showing the result of calculating the 50% value from the standardized difference series data obtained in the case of non-contact, grip, and noise.
  • FIG. 17A is a graph showing the results of calculating the minimum value from the original series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 17B is a graph showing the results of calculating the maximum value from the original series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 17C is a graph showing the results of calculating the standard deviation from the original series data obtained for non-contact, weak grip, medium grip, and strong grip.
  • FIG. 17D is a graph showing the results of calculating 25% values from the original series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 17E is a graph showing the results of calculating 75% values from the original series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 18A is a graph showing the result of calculating the minimum value from the difference series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 18B is a graph showing the result of calculating the maximum value from the difference series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 18C is a graph showing the result of calculating the standard deviation from the difference series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 18D is a graph showing the result of calculating the 25% value from the difference series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • FIG. 18E is a graph showing the result of calculating a 75% value from the difference series data obtained in the case of non-contact, weak grip, medium grip, and strong grip.
  • Hereinafter, an embodiment according to one aspect of the present invention (hereinafter also referred to as "the present embodiment") will be described with reference to the drawings.
  • the embodiments described below are merely examples of the present invention in all respects. Needless to say, various improvements and modifications can be made without departing from the scope of the present invention. That is, in carrying out the present invention, a specific configuration according to the embodiment may be appropriately adopted.
  • the data appearing in the present embodiment are described in natural language, but more specifically, they are specified in a pseudo language, a command, a parameter, a machine language, etc. that can be recognized by a computer.
  • FIG. 1 schematically illustrates an example of an application situation of the determination system 100 according to the present embodiment.
  • the determination system 100 includes a grip determination device 1 and a model generation device 2.
  • The grip determination device 1 is a computer configured to determine, using the trained determination model 5 generated by machine learning, the grip behavior (the action of holding the object) with respect to the object based on the measurement data of the piezoelectric element S. First, the grip determination device 1 acquires time-series data 6 showing a plurality of measured values of the piezoelectric element S attached to the object in time series. Next, the grip determination device 1 calculates, from the acquired time-series data 6, one or more feature quantities 71 related to the shape of the waveform of the measured values arranged in time series. The grip determination device 1 then determines, using the trained determination model 5 generated by machine learning, whether or not the action with respect to the object is a grip based on the calculated one or more feature quantities 71. Finally, the grip determination device 1 outputs the determination result.
  • the model generation device 2 is a computer configured to generate a trained determination model 5 that can be used for determining the grip.
  • the model generator 2 acquires a plurality of training data sets 3.
  • Each training data set 3 is composed of a combination of training data and a correct answer label.
  • the training data is configured to include data of the same type as one or more features 71.
  • the correct answer label is configured to indicate the correct answer of the action for the object in the training data.
  • the model generation device 2 performs machine learning of the determination model 5 using the acquired plurality of training data sets 3.
  • As a result, it is possible to generate a trained determination model 5 that has acquired the ability to determine whether or not the action with respect to the object is a grip based on one or more feature quantities (feature quantities 71) calculated from the time-series data of the piezoelectric element S.
  • As described above, in the present embodiment, the trained determination model 5 generated by machine learning is used to determine whether or not the action with respect to the object is a grip, and one or more feature quantities 71 related to the shape of the waveform of the measured values, calculated from the time-series data 6 of the piezoelectric element S, are adopted as explanatory variables of the determination model 5. By adopting such a configuration, the present embodiment makes it possible to accurately determine, using the piezoelectric element S, whether or not the action with respect to the object is a grip.
  • the grip determination device 1 and the model generation device 2 are connected to each other via a network.
  • the type of network may be appropriately selected from, for example, the Internet, a wireless communication network, a mobile communication network, a telephone network, a dedicated network, and the like.
  • the method of exchanging data between the grip determination device 1 and the model generation device 2 does not have to be limited to such an example, and may be appropriately selected depending on the embodiment.
  • data may be exchanged between the grip determination device 1 and the model generation device 2 using a storage medium.
  • the grip determination device 1 and the model generation device 2 are each configured by a separate computer.
  • the configuration of the determination system 100 according to the present embodiment does not have to be limited to such an example, and may be appropriately determined according to the embodiment.
  • the grip determination device 1 and the model generation device 2 may be an integrated computer.
  • at least one of the grip determination device 1 and the model generation device 2 may be configured by a plurality of computers.
  • the type of the object is not particularly limited as long as it can be the object to be gripped, and may be appropriately selected according to the embodiment.
  • the object may be, for example, a grip portion of a vehicle, an in-vehicle switch, a controller of a game device, an electric tool, a household electric appliance, a grip portion of an agricultural device, or the like.
  • the grip portion of the vehicle may be, for example, a handle, a lever, a grip portion of an autonomous driving vehicle, a grip portion of a tractor, or the like.
  • the vehicle may be, for example, a two-wheeled vehicle (for example, a motorcycle, a bicycle), a four-wheeled vehicle (for example, an automobile), or the like.
  • The self-driving vehicle may be, for example, an autonomous driving bus, an autonomous driving taxi, or the like, and the grip portion of the autonomous driving vehicle may be, for example, a hanging strap held for safety purposes.
  • the in-vehicle switch may be, for example, a touch pad, a touch panel, an automatic door open / close switch, an engine start switch, or the like.
  • the electric tool may be, for example, an electric saw, a chainsaw, a lawn mower, or the like.
  • the household electric appliance may be, for example, a vacuum cleaner, an iron or the like.
  • FIG. 2 schematically illustrates an example of the hardware configuration of the grip determination device 1 according to the present embodiment.
  • The grip determination device 1 according to the present embodiment is a computer in which the control unit 11, the storage unit 12, the communication interface 13, the external interface 14, the input device 15, the output device 16, and the drive 17 are electrically connected.
  • The communication interface and the external interface are abbreviated as "communication I/F" and "external I/F", respectively. The same notation is used in the following figures.
  • the control unit 11 includes a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), etc., which are hardware processors, and is configured to execute information processing based on a program and various data.
  • the storage unit 12 is an example of a memory, and is composed of, for example, a hard disk drive, a solid state drive, or the like. In the present embodiment, the storage unit 12 stores various information such as the grip determination program 81 and the learning result data 225.
  • the grip determination program 81 is a program for causing the grip determination device 1 to perform information processing (FIG. 7) described later regarding determination of a grip on an object.
  • the grip determination program 81 includes a series of instructions for the information processing.
  • The learning result data 225 indicates information about the trained determination model 5. Details will be described later.
  • the communication interface 13 is, for example, a wired LAN (Local Area Network) module, a wireless LAN module, or the like, and is an interface for performing wired or wireless communication via a network.
  • the grip determination device 1 can execute data communication via a network with another information processing device by using the communication interface 13.
  • the external interface 14 is, for example, a USB (Universal Serial Bus) port, a dedicated port, or the like, and is an interface for connecting to an external device.
  • the type and number of external interfaces 14 may be arbitrarily selected.
  • the grip determination device 1 is directly connected to the piezoelectric element S via the external interface 14.
  • the grip determination device 1 can acquire the measurement data of the piezoelectric element S.
  • the connection method and connection relationship between the grip determination device 1 and the piezoelectric element S are not limited to such an example.
  • the grip determination device 1 may be connected via the communication interface 13. Further, for example, the grip determination device 1 may be indirectly connected to the piezoelectric element S via another computer.
  • the type of the piezoelectric element S is not particularly limited as long as it is configured to change the output by the applied force, and may be appropriately selected according to the embodiment.
  • the piezoelectric element S may be composed of a material having piezoelectricity (piezoelectric body) and electrodes.
  • the piezoelectric element S may be configured by sandwiching the piezoelectric body between two electrodes.
  • the piezoelectric material may be, for example, ceramics, an organic substance, a single crystal, or the like.
  • the measurement frequency of the piezoelectric element S may be arbitrary.
  • the measurement frequency of the piezoelectric element S may be, for example, 60 Hz or higher.
  • The input device 15 is a device for performing input, such as a mouse or a keyboard.
  • The output device 16 is a device for performing output, such as a display or a speaker. An operator such as a user can operate the grip determination device 1 by using the input device 15 and the output device 16.
  • the drive 17 is, for example, a CD drive, a DVD drive, or the like, and is a drive device for reading various information such as a program stored in the storage medium 91.
  • The storage medium 91 is a medium that accumulates information such as programs by electrical, magnetic, optical, mechanical, or chemical action so that a computer or other device, machine, or the like can read the various stored information such as programs. At least one of the grip determination program 81 and the learning result data 225 may be stored in the storage medium 91.
  • The grip determination device 1 may acquire at least one of the grip determination program 81 and the learning result data 225 from the storage medium 91. Note that FIG. 2 illustrates a disk-type storage medium such as a CD or DVD as an example of the storage medium 91.
  • the type of the storage medium 91 is not limited to the disc type, and may be other than the disc type. Examples of storage media other than the disk type include semiconductor memories such as flash memories.
  • the type of the drive 17 may be arbitrarily selected according to the type of the storage medium 91.
  • the control unit 11 may include a plurality of hardware processors.
  • the hardware processor may be composed of a microprocessor, FPGA (field-programmable gate array), or the like.
  • the storage unit 12 may be composed of a RAM and a ROM included in the control unit 11. At least one of the communication interface 13, the external interface 14, the input device 15, the output device 16, and the drive 17 may be omitted.
  • the grip determination device 1 may be composed of a plurality of computers. In this case, the hardware configurations of the computers may or may not match. Further, the grip determination device 1 may be a microcomputer, a general-purpose PC (Personal Computer), or the like, in addition to an information processing device designed exclusively for the provided service.
  • FIG. 3 schematically illustrates an example of the hardware configuration of the model generator 2 according to the present embodiment.
  • The model generation device 2 according to the present embodiment is a computer in which the control unit 21, the storage unit 22, the communication interface 23, the external interface 24, the input device 25, the output device 26, and the drive 27 are electrically connected.
  • the control units 21 to drive 27 and the storage medium 92 of the model generation device 2 may be configured in the same manner as the control units 11 to drive 17 and the storage medium 91 of the grip determination device 1, respectively.
  • the control unit 21 includes a CPU, RAM, ROM, etc., which are hardware processors, and is configured to execute various information processing based on programs and data.
  • the storage unit 22 is composed of, for example, a hard disk drive, a solid state drive, or the like. In the present embodiment, the storage unit 22 stores various information such as the model generation program 82, the plurality of learning data sets 3, and the learning result data 225.
  • the model generation program 82 is a program for causing the model generation device 2 to execute information processing (FIG. 6) described later regarding the generation of the trained determination model 5 (machine learning of the determination model 5).
  • the model generation program 82 includes a series of instructions for the information processing.
  • the plurality of training data sets 3 are used to generate the trained determination model 5.
  • the training result data 225 may be generated as a result of executing the model generation program 82.
  • At least one of the model generation program 82 and the plurality of training data sets 3 may be stored in the storage medium 92. Further, the model generation device 2 may acquire at least one of the model generation program 82 and the plurality of learning data sets 3 from the storage medium 92.
  • the control unit 21 may include a plurality of hardware processors.
  • the hardware processor may be composed of a microprocessor, FPGA, or the like.
  • the storage unit 22 may be composed of a RAM and a ROM included in the control unit 21. At least one of the communication interface 23, the external interface 24, the input device 25, the output device 26, and the drive 27 may be omitted.
  • the model generator 2 may be composed of a plurality of computers. In this case, the hardware configurations of the computers may or may not match.
  • the model generation device 2 may be a general-purpose server device, a general-purpose PC, or the like, in addition to an information processing device designed exclusively for the provided service.
  • FIG. 4 schematically illustrates an example of the software configuration of the grip determination device 1 according to the present embodiment.
  • the control unit 11 of the grip determination device 1 expands the grip determination program 81 stored in the storage unit 12 into the RAM. Then, the control unit 11 controls each component by interpreting and executing the instruction included in the grip determination program 81 expanded in the RAM by the CPU.
  • the grip determination device 1 operates as a computer including a data acquisition unit 111, a calculation unit 112, a determination unit 113, and an output unit 114 as software modules.
  • the data acquisition unit 111 acquires time-series data 6 showing a plurality of measured values of the piezoelectric element S attached to the object in time series.
  • the time series data 6 is configured to include four series data (original series data 61, standardized series data 62, difference series data 63, and standardized difference series data 64).
  • the original series data 61 is composed of signal values directly obtained from the piezoelectric element S arranged in chronological order.
  • the original sequence data 61 may be composed of the raw signal value of the piezoelectric element S.
  • the data acquisition unit 111 may acquire the raw data 120 generated by the piezoelectric element S as the original series data 61.
  • the original sequence data 61 may be composed of preprocessed signal values obtained by applying arbitrary preprocessing to the raw signal values of the piezoelectric element S.
  • the data acquisition unit 111 may acquire the original sequence data 61 by acquiring the raw data 120 from the piezoelectric element S and executing preprocessing on the acquired raw data 120.
  • the pretreatment may include noise processing such as removal of a DC component (average value) and removal of high frequency noise by a moving average, for example.
  • The moving average is time-series data obtained by calculating the average value in a section having a certain time length while shifting the section along the time axis. Before calculating the average value in a section, the data in that section may be multiplied by an arbitrary window function (for example, a Hanning window).
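  • As an illustration only, the following Python sketch shows one way such preprocessing could be implemented; the function name, the default window length, and the use of NumPy are assumptions made here for illustration and are not taken from the present application.

```python
import numpy as np

def preprocess(raw: np.ndarray, window_len: int = 25) -> np.ndarray:
    """Illustrative preprocessing: remove the DC component (average value) and
    suppress high-frequency noise with a Hanning-windowed moving average."""
    x = raw - raw.mean()                        # removal of the DC component (average value)
    window = np.hanning(window_len)             # window function applied to each section
    window /= window.sum()                      # normalize the window so the result is an average
    return np.convolve(x, window, mode="same")  # moving average along the time axis
```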
  • the standardized sequence data 62 is generated by standardizing the original sequence data composed of the signal values directly obtained from the piezoelectric element S arranged in chronological order.
  • the difference sequence data 63 is generated by calculating the difference sequence of the original sequence data configured by arranging the signal values directly obtained from the piezoelectric element S in a time series.
  • the standardized difference sequence data 64 is generated by calculating and standardizing the difference sequence of the original sequence data composed by arranging the signal values directly obtained from the piezoelectric element S in a time series.
  • each original series data may be raw data 120 or preprocessed series data obtained by performing preprocessing on the raw data 120.
  • Standardization is a normalization process that converts the standard deviation and the average value of series data into constant values (for example, the standard deviation is converted into 1 and the average value into 0). That is, the standardized series data 62 is series data obtained by executing this normalization process on the original series data, and is series data in which the standard deviation and the average value are constant values. Equation 1 below is an example of the standardization operation.
  • The difference calculation is an operation that calculates, in series data composed of a plurality of values arranged in time series, the difference between the value at each time point and the value at the immediately preceding time point.
  • the difference series is a series obtained by such a difference operation. That is, the difference sequence data 63 is sequence data obtained by executing the difference operation on the original sequence data, and is sequence data indicating a change within a specific time. Therefore, the influence of the fluctuation of the baseline can be reduced.
  • the standardized difference series data 64 is series data obtained by executing the above-mentioned difference calculation and normalization processing on the original series data. The execution order of the difference calculation and the normalization process when obtaining the standardized difference sequence data 64 may be arbitrary. Equation 2 below is an example of the difference calculation.
  • In the above equations, Δy_t indicates the value at time point t of the difference series, y_t indicates the value at time point t of the original series, and y_{t-1} indicates the value at time point t-1 of the original series. ys_t indicates the value at time point t after standardization, mean(y) indicates the average value of the series data before standardization, and std(y) indicates the standard deviation of the series data before standardization.
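  • The calculations referred to above as Equation 1 and Equation 2 are published as images in the application. Reconstructed here from the symbol definitions given above (a reconstruction, not a reproduction of the original drawings), they can be written as:

```latex
% Equation 1 (standardization): value at time point t after standardization
ys_t = \frac{y_t - \mathrm{mean}(y)}{\mathrm{std}(y)}

% Equation 2 (difference calculation): value at time point t of the difference series
\Delta y_t = y_t - y_{t-1}
```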
  • the original series data 61 is an example of the first series data, and directly shows the time series of the measured values of the piezoelectric element S.
  • the standardized series data 62 is an example of the second series data.
  • the difference series data 63 is an example of the third series data.
  • the standardized difference sequence data 64 is an example of the fourth sequence data.
  • The standardized series data 62, the difference series data 63, and the standardized difference series data 64 each indirectly indicate the time series of the measured values of the piezoelectric element S. That is, the measured values constituting the time-series data 6 may be the raw signal values obtained from the piezoelectric element S or the signal values obtained by applying preprocessing such as noise removal to the raw signal values (preprocessed signal values).
  • the data acquisition unit 111 acquires the raw data 120 from the piezoelectric element S and generates the above-mentioned series data 61 to 64 from the acquired raw data 120.
  • the data acquisition unit 111 can acquire the time series data 6 including the original series data 61, the standardized series data 62, the difference series data 63, and the standardized difference series data 64.
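  • As a minimal sketch, the four series data 61 to 64 could be generated from the raw data 120 as follows, assuming the illustrative preprocess function sketched above; the function names are hypothetical and not part of the application.

```python
import numpy as np

def standardize(s: np.ndarray) -> np.ndarray:
    # Normalization that makes the average value 0 and the standard deviation 1 (Equation 1).
    return (s - s.mean()) / s.std()

def generate_series(raw: np.ndarray) -> dict:
    """Illustrative generation of the four series data from the raw data 120."""
    original = preprocess(raw)  # original series data 61 (the preprocessing may also be omitted)
    diff = np.diff(original)    # difference calculation (Equation 2)
    return {
        "original": original,                          # series data 61
        "standardized": standardize(original),         # series data 62
        "difference": diff,                            # series data 63
        "standardized_difference": standardize(diff),  # series data 64
    }
```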
  • the calculation unit 112 calculates one or more feature quantities 71 related to the shape of the waveform of the measured values arranged in the time series from the acquired time series data 6.
  • the feature amount 71 is an example of the first feature amount.
  • the feature amount 71 relating to the shape of the waveform is a type of feature amount whose values may differ depending on the shape of the waveform.
  • the feature amount 71 may be, for example, a predetermined percentile value, skewness, kurtosis, minimum value, maximum value, or the like.
  • The predetermined percentile value indicates the measured value at a predetermined ordinal position when the measured values are arranged in order from the smallest (or the largest), assuming that the number of measured values constituting the series data is 100.
  • the 50th percentile value is the median.
  • the ordinal value (predetermined value) for obtaining the percentile value may be arbitrarily selected. In order to obtain a value that reflects the shape of the waveform well, the value of this ordinal number is preferably 10 to 40 or 60 to 90.
  • The skewness is an index indicating the asymmetry of the data distribution. The kurtosis is an index indicating the sharpness of the data distribution. Each value may be calculated by a known calculation method.
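  • As one possible known calculation method, the shape-related feature quantities could be computed with NumPy and SciPy as sketched below; the function name and the choice of the 25th and 75th percentiles (matching the experimental examples) are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def shape_features(series: np.ndarray, percentiles=(25, 75)) -> dict:
    """Illustrative feature quantities 71 related to the shape of the waveform."""
    feats = {f"p{p}": float(np.percentile(series, p)) for p in percentiles}
    feats["skewness"] = float(stats.skew(series))      # asymmetry of the data distribution
    feats["kurtosis"] = float(stats.kurtosis(series))  # sharpness of the data distribution
    return feats
```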
  • the determination unit 113 includes the learned determination model 5 generated by machine learning by holding the learning result data 225.
  • the determination unit 113 uses the learned determination model 5 to determine whether or not the action with respect to the object is a grip based on the calculated one or more feature quantities 71. Determining whether an action on an object is a grip may include determining whether the action on the object is a grip or other noise behavior.
  • the noise behavior may be an action of exerting a force on the object, for example, by knocking the object temporarily or periodically.
  • In the above example, in order to accurately determine whether or not the action with respect to the object is a grip, the one or more feature quantities 71 are preferably composed of the predetermined percentile value, skewness, kurtosis, or a combination thereof.
  • the calculation unit 112 further calculates one or more feature quantities 75 related to the amplitude from the acquired time series data 6.
  • the feature amount 75 is an example of the second feature amount.
  • the feature amount 75 relating to the amplitude is a type of feature amount in which the values may differ if the amplitude (magnitude of the measured value) is different.
  • the feature amount 75 may be, for example, a minimum value, a maximum value, a standard deviation, a predetermined percentile value, or the like.
  • the determination unit 113 further determines the strength of the grip with respect to the object based on the calculated one or more feature quantities 75 using the learned determination model 5.
  • In the above example, in order to accurately determine the strength of the grip, the one or more feature quantities 75 are preferably composed of the minimum value, the maximum value, the standard deviation, or a combination thereof.
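  • A corresponding sketch of the amplitude-related feature quantities is shown below, together with a hypothetical use of the trained determination model 5; the model interface in the comments is an assumption for illustration only.

```python
import numpy as np

def amplitude_features(series: np.ndarray) -> dict:
    """Illustrative feature quantities 75 related to the amplitude."""
    return {
        "min": float(series.min()),
        "max": float(series.max()),
        "std": float(series.std()),
    }

# Hypothetical use (interface assumed): the feature quantities 71 and 75 calculated from
# each of the four series of the time-series data 6 are concatenated into one feature
# vector x, and the trained determination model 5 returns the grip/noise label and,
# when the action is a grip, the grip strength:
#   grip_label, grip_strength = trained_model.predict([x])[0]
```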
  • the output unit 114 outputs the result of each determination.
  • FIG. 5 schematically illustrates an example of the software configuration of the model generation device 2 according to the present embodiment.
  • the control unit 21 of the model generation device 2 expands the model generation program 82 stored in the storage unit 22 into the RAM. Then, the control unit 21 controls each component by interpreting and executing the instruction included in the model generation program 82 expanded in the RAM by the CPU.
  • The model generation device 2 according to the present embodiment operates as a computer including a data collection unit 211, a learning processing unit 212, and a storage processing unit 213 as software modules.
  • the data collection unit 211 collects a plurality of learning data sets 3 for use in machine learning.
  • Each training data set 3 is composed of a combination of training data 31 and a correct answer label 32.
  • the training data 31 is configured to include data of the same type as one or more feature quantities 71.
  • the correct answer label 32 is configured to indicate the correct answer (whether or not it is a grip) of the action for the object in the training data 31.
  • In the present embodiment, since the grip determination device 1 determines whether the action with respect to the object is a grip or another, noise-like action, the correct answer label 32 is configured to indicate the correct answer of this determination process.
  • the trained determination model 5 is configured to further determine the strength of the grip. Therefore, in the present embodiment, the training data 31 is configured to further include data of the same type as one or more feature quantities 75. Further, the correct answer label 32 is configured to further indicate the correct answer of the grip strength in the training data 31.
  • the data collecting unit 211 may acquire the measurement data of the time interval in which the behavior with respect to the object is specified from the piezoelectric element S or the same type of piezoelectric element as the raw learning data 220. Subsequently, the data collecting unit 211 may generate the learning time series data 221 from the learning raw data 220 by the same method as the data acquisition unit 111.
  • the learning raw data 220 corresponds to the raw data 120
  • the learning time-series data 221 corresponds to the time-series data 6.
  • the data collecting unit 211 may generate the training data 31 of each learning data set 3 from the learning time series data 221 by executing the same arithmetic processing as the calculation unit 112. Then, the data collecting unit 211 associates the information indicating the specified action (whether or not the grip / the strength of the grip) with the corresponding training data 31 as the correct answer label 32. As a result, each training data set 3 can be generated.
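  • The following sketch illustrates how each learning data set 3 could be assembled, assuming the illustrative generate_series, shape_features, and amplitude_features helpers sketched above; it is not the application's own implementation.

```python
import numpy as np

def build_learning_data_sets(raw_segments, labels):
    """Illustrative assembly of learning data sets 3: each segment of learning raw
    data 220 becomes a feature vector (training data 31) paired with its correct
    answer label 32."""
    X, y = [], []
    for raw, label in zip(raw_segments, labels):
        features = []
        for series in generate_series(raw).values():              # learning time-series data 221
            features.extend(shape_features(series).values())      # same type as feature quantities 71
            features.extend(amplitude_features(series).values())  # same type as feature quantities 75
        X.append(features)
        y.append(label)  # e.g. ("grip", "strong") or ("noise", "none")
    return np.asarray(X), np.asarray(y)
```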
  • the learning processing unit 212 uses a plurality of learning data sets 3 to perform machine learning of the determination model 5.
  • the configuration of the determination model 5 is not particularly limited and may be appropriately selected depending on the embodiment.
  • the machine learning method may be appropriately selected according to the configuration of the determination model 5.
  • As the configuration of the determination model 5 and the method of machine learning a known configuration and method may be adopted.
  • As a result of carrying out this machine learning based on the feature amount (feature amount 71 / feature amount 75) calculated from the time series data of the piezoelectric element S, whether or not the action with respect to the object is a grip and the strength of the grip are determined. It is possible to generate a trained determination model 5 that has acquired the ability to determine.
  • The storage processing unit 213 generates information about the generated trained determination model 5, that is, the result of the machine learning, as the learning result data 225.
  • the training result data 225 is configured to include information for reproducing the learned determination model 5. Then, the storage processing unit 213 stores the generated learning result data 225 in a predetermined storage area.
  • each software module of the grip determination device 1 and the model generation device 2 will be described in detail in an operation example described later.
  • an example in which each software module of the grip determination device 1 and the model generation device 2 is realized by a general-purpose CPU is described.
  • some or all of the software modules may be implemented by one or more dedicated processors. That is, each of the above modules may be realized as a hardware module.
  • software modules may be omitted, replaced, or added as appropriate according to the embodiment.
  • the learning result data 225 is not always generated separately from the learned determination model 5, and may include a case where only the learned determination model reflecting the learning result data 225 is generated.
  • FIG. 6 is a flowchart showing an example of a processing procedure related to machine learning of the determination model 5 by the model generation device 2 according to the present embodiment.
  • the processing procedure of the model generator 2 is only an example, and each step may be changed as much as possible. Further, regarding the following processing procedure of the model generation device 2, it is possible to omit, replace, and add steps as appropriate according to the embodiment.
  • In step S201, the control unit 21 operates as the data collection unit 211 and collects a plurality of learning data sets 3.
  • Each learning data set 3 may be generated as appropriate.
  • For example, the piezoelectric element S or a piezoelectric element of the same type is prepared, the prepared piezoelectric element is attached to an object assumed to be the target of determination, and various actions including a grip are performed on the object. As a result, the learning raw data 220 is generated. Subsequently, the learning time-series data 221 is generated from the learning raw data 220.
  • The learning time-series data 221 is configured to include the original series data, the standardized series data, the difference series data, and the standardized difference series data. Next, feature quantities of the same type as the feature quantities 71 are calculated from the generated learning time-series data 221.
  • Similarly, feature quantities of the same type as the feature quantities 75 are calculated from the learning time-series data 221.
  • the calculated feature amount can be acquired as the training data 31.
  • the information indicating the action (whether or not the grip / the strength of the grip) performed on the object is associated with the corresponding training data 31 as the correct answer label 32.
  • Each learning data set 3 may be automatically generated by the operation of a computer, or may be manually generated by at least partially including an operator operation. Further, the generation of each training data set 3 may be performed by the model generation device 2 or may be performed by a computer other than the model generation device 2.
  • the control unit 21 collects each learning data set 3 by automatically or manually executing the generation process by the operation of the operator.
  • the control unit 21 collects each learning data set 3 via, for example, a network, a storage medium 92, or the like.
  • a model generation device 2 may generate a part of the training data set 3, and one or a plurality of other computers may generate another training data set 3.
  • the number of learning data sets 3 to be collected is not particularly limited and may be appropriately determined according to the embodiment.
  • the control unit 21 proceeds to the next step S202.
  • In step S202, the control unit 21 operates as the learning processing unit 212 and performs machine learning of the determination model 5 using the plurality of collected learning data sets 3.
  • the determination model 5 includes one or more arithmetic parameters for deriving the inference result.
  • the inference result of the determination model 5 based on the target data is obtained as an output from the determination model 5 by inputting the target data into the determination model 5 and executing the arithmetic processing of the determination model 5.
  • the output format of the determination model 5 is not particularly limited and may be appropriately determined according to the embodiment.
  • Machine learning is configured by training the determination model 5 (that is, adjusting the values of its arithmetic parameters) so that, for each learning data set 3, the inference result of the determination model 5 based on the training data 31 matches the corresponding correct answer label 32.
  • For the configuration of the determination model 5, a machine learning model such as a decision tree model, a neural network, a regression model, or a support vector machine may be adopted.
  • For the machine learning method (the method of adjusting the arithmetic parameters), a known method such as a random forest, error backpropagation, or regression analysis may be adopted, depending on the adopted model.
  • the determination model 5 may be configured by a decision tree model.
  • the threshold value of the conditional branch is an example of the calculation parameter.
  • the control unit 21 adjusts the values of the calculation parameters of the decision tree model by the random forest method as a machine learning process.
  • the determination model 5 may be configured by a neural network. In this case, the weight of the connection between each neuron, the threshold value of each neuron, and the like are examples of arithmetic parameters.
  • the control unit 21 adjusts the value of the calculation parameter by the error back propagation method as a machine learning process.
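  • A minimal sketch of this machine learning step using scikit-learn's random forest implementation is given below; the library, the hyperparameters, and the two-column label format (grip/noise and grip strength) are assumptions for illustration, not part of the application.

```python
from sklearn.ensemble import RandomForestClassifier

def train_determination_model(X, y):
    """X: training data 31 (feature vectors of the same type as feature quantities 71 and 75).
    y: correct answer labels 32, here assumed as two columns [grip or noise, grip strength]."""
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X, y)  # adjusts the arithmetic parameters (the thresholds of the conditional branches)
    return model
```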
  • In step S203, the control unit 21 operates as the storage processing unit 213 and generates learning result data 225 indicating the generated trained determination model 5.
  • the configuration of the learning result data 225 is not particularly limited as long as the information for executing the inference (determination) operation can be retained, and may be appropriately determined according to the embodiment.
  • the learning result data 225 may be composed of the configuration of the determination model 5 (for example, the structure of the neural network, etc.) and the information indicating the values of the arithmetic parameters obtained by the above adjustment. Then, the control unit 21 stores the generated learning result data 225 in a predetermined storage area.
  • the predetermined storage area may be, for example, a RAM in the control unit 21, a storage unit 22, an external storage device, a storage medium, or a combination thereof.
  • the storage medium may be, for example, a CD, a DVD, or the like, and the control unit 21 may store the learning result data 225 in the storage medium via the drive 27.
  • the external storage device may be, for example, a data server such as NAS (Network Attached Storage). In this case, the control unit 21 may store the learning result data 225 in the data server via the network by using the communication interface 23. Further, the external storage device may be, for example, an external storage device connected to the model generation device 2 via the external interface 24.
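  • As one possible way of storing and later reproducing the learning result data 225, the following sketch uses joblib; the file name and the library choice are assumptions for illustration.

```python
import joblib

# Store the configuration and the adjusted arithmetic parameter values of the
# trained determination model 5 in a predetermined storage area.
joblib.dump(model, "learning_result_data_225.joblib")

# Reproduce the trained determination model 5 from the stored learning result data 225.
model = joblib.load("learning_result_data_225.joblib")
```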
  • Then, the control unit 21 ends the processing procedure related to this operation example.
  • the generated learning result data 225 may be provided to the grip determination device 1 at an arbitrary timing.
  • the control unit 21 may transfer the learning result data 225 to the grip determination device 1 as the process of step S203 or separately from the process.
  • the grip determination device 1 may acquire the learning result data 225 by receiving this. Further, for example, the grip determination device 1 may acquire the learning result data 225 by accessing the model generation device 2 or the data server via the network using the communication interface 13. Further, for example, the grip determination device 1 may acquire the learning result data 225 via the storage medium 91. Further, for example, the learning result data 225 may be incorporated in the grip determination device 1 in advance.
  • The control unit 21 may update or newly generate the learning result data 225 by repeating the processes of steps S201 to S203 periodically or irregularly. At the time of this repetition, at least a part of the plurality of learning data sets 3 may be changed, modified, added, deleted, or the like as appropriate. The control unit 21 may then provide the updated or newly generated learning result data 225 to the grip determination device 1 by an arbitrary method. As a result, the learning result data 225 held by the grip determination device 1 may be updated.
  • FIG. 7 is a flowchart showing an example of a processing procedure related to grip determination by the grip determination device 1 according to the present embodiment.
  • the following processing procedure of the grip determination device 1 is an example of the grip determination method.
  • the following processing procedure of the grip determination device 1 is only an example, and each step may be changed as much as possible. Further, regarding the following processing procedure of the grip determination device 1, it is possible to omit, replace, and add steps as appropriate according to the embodiment.
  • In step S101, the control unit 11 operates as the data acquisition unit 111 and acquires time-series data 6 indicating, in time series, a plurality of measured values of the piezoelectric element S attached to the object.
  • In the present embodiment, the control unit 11 acquires time-series data 6 including the four series data 61 to 64.
  • For example, the control unit 11 may generate preprocessed data by acquiring raw data 120 from the piezoelectric element S and executing preprocessing on the acquired raw data 120.
  • The preprocessing may include a process of removing the DC component (average value) and a process of removing high-frequency noise by a moving average.
  • The control unit 11 may acquire the preprocessed data as the original series data 61. Further, the control unit 11 may generate the standardized series data 62 by executing the standardization operation (the calculation of Equation 1 above) on the obtained preprocessed data (original series data).
  • The control unit 11 may generate the difference series data 63 by executing the difference calculation (the operation of Equation 2 above) on the obtained preprocessed data.
  • The control unit 11 may generate the standardized difference series data 64 by executing the difference calculation on the preprocessed data and then executing the standardization operation on the series data obtained by the difference calculation.
  • The preprocessing may be omitted in the process of generating each of the series data 61 to 64. By executing these processes, the control unit 11 can acquire the time-series data 6 including the four series data 61 to 64.
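  • The series-data generation described above can be sketched as follows. Because Equations 1 and 2 are not reproduced in this section, the standardization is assumed here to be a zero-mean, unit-variance operation and the difference calculation a first-order difference; the moving-average window and sample count are illustrative only.

```python
import numpy as np

def preprocess(raw, window=5):
    """Assumed preprocessing: remove the DC component (mean) and suppress
    high-frequency noise with a simple moving average."""
    x = raw - raw.mean()
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def standardize(x):
    """Assumed form of the standardization operation (Equation 1 is not shown here)."""
    return (x - x.mean()) / (x.std() + 1e-12)

def difference(x):
    """Assumed form of the difference calculation (Equation 2 is not shown here)."""
    return np.diff(x, prepend=x[0])

# Stand-in for raw data 120: 400 samples (e.g. 400 ms at an assumed 1 kHz sampling rate).
raw = np.random.default_rng(0).normal(size=400)

original = preprocess(raw)             # original series data 61
standardized = standardize(original)   # standardized series data 62
diff = difference(original)            # difference series data 63
standardized_diff = standardize(diff)  # standardized difference series data 64
```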
  • the generation process of each of the series data 61 to 64 may be executed by another computer. In this case, the control unit 11 may acquire the time series data 6 generated by another computer.
  • The time length of the time-series data 6 may be set arbitrarily.
  • However, in order to discriminate with high accuracy between noise of 4 Hz to 30 Hz and a grip, the time length of the time-series data 6 (that is, of each of the series data 61 to 64) may be set to 400 ms or more.
  • In step S102, the control unit 11 operates as the calculation unit 112 and calculates, from the acquired time-series data 6, one or more feature quantities 71 related to the shape of the waveform of the measured values arranged in the time series.
  • the one or more feature quantities 71 may be composed of a predetermined percentile value, skewness, kurtosis, or a combination thereof.
  • the control unit 11 can obtain one or more feature quantities 71 by calculating at least one of these types of feature quantities from the series data 61 to 64.
  • The control unit 11 also operates as the calculation unit 112 and calculates, from the acquired time-series data 6, one or more feature quantities 75 related to the amplitude.
  • the one or more feature quantities 75 may be composed of a minimum value, a maximum value, a standard deviation, or a combination thereof.
  • the control unit 11 can obtain one or more feature quantities 75 by calculating at least one of these types of feature quantities from the series data 61 to 64.
  • The types and numbers of the feature quantities (71, 75) calculated for the series data 61 to 64 may or may not match among the series data.
  • The type and number of each feature quantity (71, 75) calculated from each of the series data 61 to 64 may be selected as appropriate according to the embodiment. However, for the standardized series data 62 and the standardized difference series data 64, it is preferable to select feature quantities other than the standard deviation. After calculating each feature quantity (71, 75), the control unit 11 proceeds to the next step S103.
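  • A sketch of the feature calculation in step S102 is given below, assuming NumPy/SciPy implementations of the percentile values, skewness, kurtosis, minimum, maximum, and standard deviation; the function names are hypothetical.

```python
import numpy as np
from scipy import stats

def shape_features(x):
    """Feature quantities 71: related to the shape of the waveform
    (predetermined percentile values, skewness, kurtosis)."""
    return {
        "p25": np.percentile(x, 25),
        "p50": np.percentile(x, 50),
        "p75": np.percentile(x, 75),
        "skewness": stats.skew(x),
        "kurtosis": stats.kurtosis(x),
    }

def amplitude_features(x):
    """Feature quantities 75: related to the amplitude
    (minimum value, maximum value, standard deviation)."""
    return {"min": float(x.min()), "max": float(x.max()), "std": float(x.std())}

# Example on one series; for the standardized series data 62 and 64 the
# standard deviation would typically be dropped, as noted above.
series = np.random.default_rng(1).normal(size=400)
print(shape_features(series))
print(amplitude_features(series))
```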
  • In step S103, the control unit 11 operates as the determination unit 113 and sets up the trained determination model 5 by referring to the learning result data 225. Then, using the trained determination model 5, the control unit 11 determines, based on the calculated one or more feature quantities 71, whether or not the action with respect to the object is a grip. Determining whether an action on an object is a grip may include determining whether the action on the object is a grip or other, noise-like behavior. Further, using the trained determination model 5, the control unit 11 determines the strength of the grip with respect to the object based on the calculated one or more feature quantities 75.
  • Specifically, the control unit 11 inputs the calculated one or more feature quantities 71 and one or more feature quantities 75 into the trained determination model 5 and executes the arithmetic processing of the trained determination model 5. As a result, the control unit 11 acquires from the trained determination model 5 the output corresponding to the result of each determination. When the result of each determination has been acquired, the control unit 11 proceeds to the next step S104.
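  • Step S103 might then be sketched as below, assuming the trained model was fitted on feature vectors with a fixed ordering; the function name, label names, and feature ordering are hypothetical.

```python
import numpy as np

def determine_grip(trained_model, feature_vector):
    """Sketch of step S103: feed the calculated feature quantities 71 and 75
    (concatenated in the order used during training) to the trained model and
    return its determination result."""
    x = np.asarray(feature_vector, dtype=float).reshape(1, -1)
    return trained_model.predict(x)[0]   # e.g. "non_contact", "noise", "grip_weak", ...
```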
  • In step S104, the control unit 11 operates as the output unit 114 and outputs information regarding the result of each determination.
  • the output destination and the content of the information to be output may be appropriately determined according to the embodiment.
  • the control unit 11 may output the result of each determination in step S103 to the output device 16 as it is. Further, as another example, the control unit 11 may execute predetermined information processing based on the result of each determination. The control unit 11 may output the result of executing the information processing as information regarding the result of each determination. For example, the control unit 11 may recognize the on / off (or grip strength) of the grip action based on the result of the determination in step S103. Then, the control unit 11 may execute some action according to the on / off of the grip action.
  • the control unit 11 may continuously and repeatedly execute a series of information processing in steps S101 to S104.
  • the timing of repeating may be arbitrary.
  • the grip determination device 1 may continuously perform a determination task relating to the grip on the object.
  • the control unit 11 may execute a series of information processing in real time. That is, the control unit 11 may execute a series of information processing of steps S101 to S104 for the data acquired in real time from the piezoelectric element S.
  • the control unit 11 may execute a series of information processing in steps S101 to S104 on the data acquired in the past from the piezoelectric element S.
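  • The continuous, repeated execution of steps S101 to S104 described above could be organized as in the following sketch; `read_piezo_window`, `compute_features`, and `act_on` are hypothetical helpers standing in for the acquisition, feature-calculation, and output steps, not functions defined by this disclosure.

```python
import time

def run_grip_determination(trained_model, window_ms=400):
    """Sketch of repeating steps S101 to S104 continuously in real time.
    read_piezo_window, compute_features, and act_on are hypothetical helpers."""
    while True:
        series = read_piezo_window(window_ms)           # S101: acquire time-series data 6
        features = compute_features(series)             # S102: feature quantities 71 and 75
        result = trained_model.predict([features])[0]   # S103: determination
        act_on(result)                                  # S104: output information / take action
        time.sleep(0.0)                                 # the repetition timing is arbitrary
```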
  • FIG. 8B shows a sample of measurement data obtained during grip (strength: medium).
  • FIG. 8C shows a sample of measurement data obtained during noise behavior (knocking). In each of these figures, time is shown on the horizontal axis, and the preprocessed measured value of the piezoelectric element, from which the DC component has been removed, is shown on the vertical axis.
  • The obtained measurement data of each action was divided into segments with time lengths of 1000 ms, 500 ms, 400 ms, and 300 ms, and time-series data for learning (original series data, standardized series data, difference series data, and standardized difference series data) were generated for each action type and each division time length.
  • FIGS. 9A to 9D show histogram samples of the original series data (FIG. 9A), the difference series data (FIG. 9B), the standardized series data (FIG. 9C), and the standardized difference series data (FIG. 9D) obtained at the time of non-contact.
  • FIGS. 10A to 10D show the corresponding histogram samples of the original series data (FIG. 10A), the difference series data (FIG. 10B), the standardized series data (FIG. 10C), and the standardized difference series data (FIG. 10D).
  • each feature amount was calculated from the generated time-series data for learning (each series data).
  • As the features of the original series data and the difference series data, the 25th percentile value, 50th percentile value (median), 75th percentile value, minimum value, maximum value, standard deviation, kurtosis, and skewness were selected.
  • As the features of the standardized series data and the standardized difference series data, the 25th percentile value, 50th percentile value (median), 75th percentile value, minimum value, maximum value, kurtosis, and skewness were selected.
  • Training data was composed of each calculated feature amount, and a correct answer label indicating each action type was associated with the obtained training data.
  • A plurality of training data sets were generated for each time length of the time-series data: about 240 training data sets corresponding to the 1000 ms time-series data, about 480 corresponding to the 500 ms time-series data, about 600 corresponding to the 400 ms time-series data, and about 800 corresponding to the 300 ms time-series data.
  • The prepared handles were then used to acquire further measurement data of each behavior for verification.
  • The time length of each piece of acquired verification measurement data was about 30 seconds.
  • The acquired verification measurement data was divided into segments of the corresponding time lengths, and about 30 pieces of 1000 ms time-series data, about 60 pieces of 500 ms time-series data, about 75 pieces of 400 ms time-series data, and about 100 pieces of 300 ms time-series data were generated for each action type.
  • FIG. 12 shows the result of calculating the discrimination accuracy of each trained judgment model.
  • the discrimination accuracy was calculated by combining the grip behaviors of three intensities into one (that is, the grip behaviors of three intensities were treated as a single grip behavior).
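  • A sketch of how such a merged-label accuracy might be computed is shown below; the label strings and the scikit-learn scorer are assumptions for illustration, not the evaluation code used in the experiment.

```python
from sklearn.metrics import accuracy_score

def merge_grip(labels):
    """Treat the three grip intensities as a single 'grip' class before scoring."""
    return ["grip" if label.startswith("grip") else label for label in labels]

# Hypothetical true and predicted labels for five verification windows.
y_true = ["grip_weak", "noise", "grip_strong", "non_contact", "grip_medium"]
y_pred = ["grip_medium", "noise", "grip_strong", "non_contact", "noise"]

print(accuracy_score(merge_grip(y_true), merge_grip(y_pred)))  # -> 0.8
```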
  • The discrimination accuracy of the trained determination model for each time length exceeded 90%. From this result, it was found that, by using each of the above features as an explanatory variable, machine learning can generate a determination model capable of judging with high accuracy whether or not the behavior with respect to the object is a grip.
  • When the time length of the time-series data was set to 400 ms or more, the accuracy of discrimination between the behaviors, in particular between noise behavior (knocking at about 1 Hz to 5 Hz) and grip behavior, exceeded 96%. From this result, it was found that the time length of the time-series data should be 400 ms or more in order to discriminate with high accuracy between knock behavior of 1 Hz or more (particularly 4 Hz to 30 Hz) and grip behavior.
  • the discrimination accuracy of weak grip and strong grip by the trained judgment model of 300 milliseconds was 94.06% and 87.38%.
  • the discrimination accuracy of weak grip and strong grip by the trained judgment model of 400 milliseconds was 90.79% and 89.61%.
  • the discrimination accuracy of weak grip and strong grip by the trained judgment model of 500 milliseconds was 91.67% and 95.08%.
  • the discrimination accuracy of weak grip and strong grip by the trained judgment model of 1000 milliseconds was 96.67% and 100%. From this result, it was found that by using each feature amount as an explanatory variable, it is possible to generate a judgment model having the ability to judge the strength of the grip with high accuracy by machine learning.
  • FIGS. 13A to 13H show the distributions of the 25th percentile value (FIG. 13A), the 75th percentile value (FIG. 13B), the skewness (FIG. 13C), the kurtosis (FIG. 13D), the minimum value (FIG. 13E), the maximum value (FIG. 13F), the standard deviation (FIG. 13G), and the 50th percentile value (FIG. 13H) calculated from the verification original series data obtained by the non-contact, grip, and noise behaviors.
  • FIGS. 14A to 14H show the distributions of the 25th percentile value (FIG. 14A), the 75th percentile value (FIG. 14B), the skewness (FIG. 14C), the kurtosis (FIG. 14D), the minimum value (FIG. 14E), the maximum value (FIG. 14F), the standard deviation (FIG. 14G), and the 50th percentile value (FIG. 14H) calculated from the verification difference series data obtained by the non-contact, grip, and noise behaviors.
  • FIGS. 15A to 15G show the distributions of the 25th percentile value (FIG. 15A), the 75th percentile value (FIG. 15B), the skewness (FIG. 15C), the kurtosis (FIG. 15D), the minimum value (FIG. 15E), the maximum value (FIG. 15F), and the 50th percentile value (FIG. 15G) calculated from the verification standardized series data obtained by the non-contact, grip, and noise behaviors.
  • FIGS. 16A to 16G show the distributions of the 25th percentile value (FIG. 16A), the 75th percentile value (FIG. 16B), the skewness (FIG. 16C), the kurtosis (FIG. 16D), the minimum value (FIG. 16E), the maximum value (FIG. 16F), and the 50th percentile value (FIG. 16G) calculated from the verification standardized difference series data obtained by the non-contact, grip, and noise behaviors.
  • The predetermined percentile values are examples of feature quantities affected by the shape of the waveform of the measured values (that is, their values can change when the shape fluctuates).
  • FIGS. 17A to 17E show the distributions of the minimum value (FIG. 17A), the maximum value (FIG. 17B), the standard deviation (FIG. 17C), the 25th percentile value (FIG. 17D), and the 75th percentile value (FIG. 17E) calculated from the verification original series data obtained by the non-contact action and the grip actions of each strength.
  • FIGS. 18A to 18E show the distributions of the minimum value (FIG. 18A), the maximum value (FIG. 18B), the standard deviation (FIG. 18C), the 25th percentile value (FIG. 18D), and the 75th percentile value (FIG. 18E) calculated from the verification difference series data obtained by the non-contact action and the grip actions of each strength.
  • the minimum value, maximum value and standard deviation are examples of features that are affected by the amplitude of the measured value (that is, the values can change if the amplitude fluctuates).
  • the distribution of minimum, maximum and standard deviation was correlated with grip strength. From this result, it was found that the feature amount related to the amplitude of the measured value is effective for determining the strength of the grip.
  • the distributions of the 25th percentile value and the 75th percentile value calculated from the original series data and the difference series data also had a correlation with the grip strength. From this result, it was found that a predetermined percentile value can also be used to determine the strength of the grip as a feature amount related to the amplitude of the measured value.
  • As described above, in the present embodiment, in step S103, the trained determination model 5 generated by machine learning is used, and the one or more feature quantities 71 relating to the shape of the waveform of the measured values, calculated from the time-series data 6 of the piezoelectric element S, are adopted as explanatory variables to determine whether or not the action with respect to the object is a grip.
  • According to the present embodiment, by adopting such a configuration, it is possible, as shown in the above experimental example, to accurately determine whether or not the action with respect to the object is a grip by using the piezoelectric element S.
  • Likewise, in step S103, the one or more feature quantities 75 relating to the amplitude, calculated from the time-series data 6 of the piezoelectric element S, are adopted as explanatory variables of the trained determination model 5 generated by machine learning to determine the strength of the grip with respect to the object. According to the present embodiment, by adopting such a configuration, the strength of the grip with respect to the object can be accurately determined by using the piezoelectric element S, as shown in the above experimental example.
  • In the present embodiment, the time-series data used at the time of learning and at the time of inference include the standardized series data and the standardized difference series data. The standardization operation sets the average value of the time-series data to a constant value. As a result, the possibility of erroneous determination caused by the average value of the measured values can be reduced, and the accuracy of determining whether or not the action on the object is a grip (in particular, of discriminating between a grip and noise behavior) can be expected to improve.
  • Further, the time-series data used at the time of learning and at the time of inference include the difference series data and the standardized difference series data. The difference calculation removes fluctuations (noise) of the baseline in the signal of the piezoelectric element S; that is, the baseline of the signal can be unified to some extent. As a result, an improvement in the determination accuracy in step S103 can be expected.
  • the time series data at the time of learning and at the time of inference are configured to include four series data of original series data, standardized series data, difference series data, and standardized difference series data.
  • the structure of the time series data does not have to be limited to such an example.
  • the time series data at least one of the original series data, the standardized series data, the difference series data, and the standardized difference series data may be omitted.
  • the trained determination model 5 is configured to perform two determinations of whether or not the action with respect to the object is a grip and the strength of the grip.
  • the configuration of the trained determination model 5 does not have to be limited to such an example. One of the two determinations may be omitted.
  • When the trained determination model 5 is configured to have only the ability to determine whether or not the action on the object is a grip, the data related to acquiring the ability to determine the strength of the grip may be omitted from the training data 31 and the correct answer label 32 in the above machine learning phase. In addition, in the inference (determination) phase, the process of calculating the feature quantities 75 and the process of determining the grip strength may be omitted.
  • Conversely, when the trained determination model 5 is configured to have only the ability to determine the strength of the grip, the data related to acquiring the ability to determine whether or not the action on the object is a grip may be omitted from the training data 31 and the correct answer label 32 in the above machine learning phase. In addition, in the inference phase, the process of calculating the feature quantities 71 and the process of determining the behavior with respect to the object may be omitted.
  • When the trained determination model 5 is configured to have only the ability to determine the strength of the grip, it is preferable that the time-series data include at least one of the original series data and the difference series data.
  • the strength of the grip is set in three stages (weak, medium, and strong).
  • the scale of grip strength does not have to be limited to such examples.
  • the strength of the grip may be estimated at a continuous value or at any number of levels.
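  • If the strength is estimated as a continuous value, a regression model could replace the classifier; the following sketch uses a random-forest regressor on synthetic amplitude features purely as an assumed illustration, not as part of the disclosed embodiment.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: amplitude-related feature quantities (e.g. min, max, std)
# per time-series window, with a continuous grip-strength target instead of the
# three-stage weak / medium / strong labels.
X = rng.normal(size=(300, 3))
strength = np.abs(X[:, 1] - X[:, 0])   # synthetic continuous strength, for illustration only

regressor = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, strength)
print(regressor.predict(X[:1]))        # continuous strength estimate for one window
```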
  • one trained determination model 5 is configured to have the ability to execute the above two determinations.
  • the number of trained determination models does not have to be limited to such examples.
  • a separate trained determination model may be prepared for each determination process.
  • In the above experiment, the determination was performed using time-series data obtained by dividing the measurement data into segments with a length of 400 ms or more. However, data with a length of 400 ms or more may instead be acquired and determined in an overlapping (duplicated) manner while shifting the cutting position by 100 ms.
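  • The overlapping segmentation described above might look like the following sketch; the 1 kHz sampling rate (so that 400 samples correspond to 400 ms and a 100-sample step to a 100 ms shift) is an assumption.

```python
import numpy as np

def sliding_windows(signal, window_len, step):
    """Cut overlapping segments of window_len samples, shifting the cutting
    position by step samples between consecutive segments."""
    return [signal[i:i + window_len]
            for i in range(0, len(signal) - window_len + 1, step)]

# Illustration: 3 s of measured values at an assumed 1 kHz sampling rate,
# cut into 400 ms windows shifted by 100 ms.
signal = np.random.default_rng(0).normal(size=3000)
windows = sliding_windows(signal, window_len=400, step=100)
print(len(windows))  # 27 overlapping segments
```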
  • Preservation processing unit 220 ... Raw data for learning, 221 ... Time-series data for learning, 225 ... Learning result data, 3 ... Learning data set, 31 ... Training data, 32 ... Correct label, 5 ... Judgment model, 6 ... Time series data, 61 ... Original series data (first series data), 62 ... Standardized series data (second series data), 63 ... Difference series data (third series data), 64 ... Standardized difference series data (4th series data), 71 ... Feature amount (first feature amount), 75 ... Feature amount (second feature amount)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Measurement Of Mechanical Vibrations Or Ultrasonic Waves (AREA)

Abstract

A grip determination device (1) according to one aspect of the present invention: acquires time-series data from a piezoelectric element (S) attached to an object; calculates, from the time-series data, at least one feature value relating to the shape of the waveform of the measured values arranged in the time series; and uses a trained determination model generated by machine learning to determine, on the basis of the calculated feature value(s), whether an action performed on the object is a grip.
PCT/JP2021/011421 2020-06-12 2021-03-19 Dispositif de détermination de préhension, procédé de détermination de préhension et programme de détermination de préhension WO2021250971A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112021000332.1T DE112021000332T5 (de) 2020-06-12 2021-03-19 Greifbeurteilungsvorrichtung, Greifbeurteilungsmethode und Greifbeurteilungsprogramm
JP2022530034A JP7355242B2 (ja) 2020-06-12 2021-03-19 グリップ判定装置、グリップ判定方法及びグリップ判定プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020102181 2020-06-12
JP2020-102181 2020-06-12

Publications (1)

Publication Number Publication Date
WO2021250971A1 true WO2021250971A1 (fr) 2021-12-16

Family

ID=78845543

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/011421 WO2021250971A1 (fr) 2020-06-12 2021-03-19 Dispositif de détermination de préhension, procédé de détermination de préhension et programme de détermination de préhension

Country Status (3)

Country Link
JP (1) JP7355242B2 (fr)
DE (1) DE112021000332T5 (fr)
WO (1) WO2021250971A1 (fr)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7001219B2 (ja) 2017-12-22 2022-01-19 三井化学株式会社 振動計測装置

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004237022A (ja) * 2002-12-11 2004-08-26 Sony Corp 情報処理装置および方法、プログラム、並びに記録媒体
WO2019017099A1 (fr) * 2017-07-20 2019-01-24 ソニー株式会社 Appareil électronique, dispositif de traitement d'informations et procédé de traitement d'informations

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116306937A (zh) * 2023-03-22 2023-06-23 中航信移动科技有限公司 一种基于时间序列离线数据的规则提取方法、介质及设备
CN116306937B (zh) * 2023-03-22 2023-11-10 中航信移动科技有限公司 一种基于时间序列离线数据的规则提取方法、介质及设备

Also Published As

Publication number Publication date
JP7355242B2 (ja) 2023-10-03
JPWO2021250971A1 (fr) 2021-12-16
DE112021000332T5 (de) 2022-09-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21822621

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022530034

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 21822621

Country of ref document: EP

Kind code of ref document: A1