TWI761834B - Intelligent method for testing sensed data and system thereof - Google Patents
- Publication number
- TWI761834B (application TW109115966A)
- Authority
- TW
- Taiwan
- Prior art keywords
- data
- model
- detected
- sensing data
- parameter
- Prior art date
- 2020-05-14
Landscapes
- Testing And Monitoring For Control Systems (AREA)
Description
The specification discloses a method for detecting sensed data, and in particular an intelligent method and system that use deep learning and machine learning to build a detection model from the sensed data.
When checking whether a product, a system, or a site is abnormal, a common approach is to use dedicated sensors to acquire data, for example by recording sound or capturing images, and then to analyze the sensed data with analysis tools so as to judge whether the object under test is in an abnormal state.
For example, to determine whether the surface of an object is defective, a camera can photograph the surface and the captured image can be compared with a sample image to reveal any abnormality. Taking a motor as an example, the usual approach is to record, with a sound sensor, the audio produced while the motor runs; after the recording is compared with a reference sound template, any operating abnormality can be identified and used as a basis for later improvement.
The specification discloses an intelligent detection method and system for sensed data. The method runs on a computer system that includes a memory storing the program set and the algorithms implementing the method, and a database storing the sensed data acquired from an object to be detected.
In one embodiment, the sensed data are obtained by sensing the object to be detected with sensors. In the intelligent detection method, the sensed data are first divided into training data and test data. A deep learning algorithm is executed on the training data to extract features of the object to be detected and to build a detection model for that object. The detection model is then tested with the test data. For the test data that fail the test, a machine learning algorithm is executed and the detection model is trained with historical parameter data; a model optimization algorithm then optimizes the detection model so as to derive optimized control parameters for the object to be detected.
Further, validation data are also taken from the training data. When the detection model is formed from the training data by the deep learning algorithm in the above steps, the validation data can be used to validate the detection model and thereby obtain a parameter model for generating control parameters. The control parameters are the parameters that drive the operation of the object to be detected, and producing optimized control parameters is one of the main purposes of the method.
Further, the parameter model can again be optimized by the model optimization algorithm to generate optimized control parameters, which continue to drive the object to be detected. The above steps are repeated to obtain sensed data once more, and the control parameters of the object to be detected are optimized by repeating the intelligent detection method.
Further, in one embodiment, a K-fold cross-validation procedure can use the validation data to evaluate the multiple models produced by multiple deep learning and machine learning algorithms; the evaluation factors include, for example, the accuracy, precision, and recall of each model, and the detection model is then selected according to the resulting scores.
For a further understanding of the features and technical content of the present invention, please refer to the following detailed description and drawings. The drawings are provided for reference and illustration only and are not intended to limit the present invention.
The following specific embodiments illustrate how the present invention is implemented, and those skilled in the art can appreciate its advantages and effects from the content disclosed in this specification. The present invention can be implemented or applied through other different specific embodiments, and the details in this specification can be modified and changed in various ways, based on different viewpoints and applications, without departing from the concept of the present invention. In addition, the drawings of the present invention are simple schematic illustrations and are not drawn to actual scale, as stated in advance. The following embodiments describe the related technical content of the present invention in further detail, but the disclosure is not intended to limit the scope of protection of the present invention.
It should be understood that although terms such as "first", "second", and "third" may be used herein to describe various elements or signals, these elements or signals should not be limited by these terms. These terms are mainly used to distinguish one element from another, or one signal from another. In addition, the term "or" as used herein may, depending on the actual situation, include any one of, or any combination of, the associated listed items.
The disclosure describes an intelligent detection method and system for sensed data. One of the main purposes of the method is to use a deep learning algorithm for the initial training and model building, and a machine learning algorithm for the later testing and training, so that a detection model for a specific object to be detected is trained from the sensed data. For example, the object to be detected may be a product, a system, or a site. Sensors acquire specific information about the object, such as images, sound, or vibration; the resulting sensed data may be images captured of the object, a spectrum converted from the sound the object produces, or information about the vibration generated while the object operates. These sensed data provide the information for detecting the various objects to be detected and for optimizing the control parameters that drive their operation.
For an embodiment of the intelligent detection system for sensed data, reference may be made to the system architecture embodiment diagram shown in FIG. 1.
The figure shows an object to be detected 10, which operates when driven by control parameters. By learning from the sensed data, the intelligent detection system can optimize the object's own control parameters, or the control parameters of the system that produces the object. For example, one or more sensors (sensor one 101, sensor two 102, and sensor three 103) sense the object to be detected 10 to produce the sensed data. If images of the object 10 are captured, sensor one 101, sensor two 102, and sensor three 103 may be cameras shooting from multiple angles; if the environmental conditions of a site are sensed, they may be sensors installed at different positions in the site. The sensed data produced by sensor one 101, sensor two 102, and sensor three 103 can be stored in a sensed-data processing host 12 and then converted and pre-processed into data in the database 145 of the computer system 14.
The computer system 14 includes a processor 141, a memory 143, and a database 145. The memory 143 stores the program set and the algorithms implementing the intelligent detection method for sensed data, and the database 145 stores the sensed data obtained by measuring the object to be detected 10 with one or more sensors; the processor 141 of the computer system 14 then executes the intelligent detection method.
Using the above system architecture, FIG. 2 shows an embodiment of the main flow of the intelligent detection method for sensed data executed by the system.
When the system starts, the sensed data obtained from the object to be detected are acquired in step S201. In step S203, the sensed data are divided into training data and testing data; in a particular embodiment, validation data can further be taken from the training data. In step S205, a deep-learning algorithm is executed on the training data to extract features of the object to be detected; these features reflect information about the operation of the object or of the related system, and from them the detection model used to detect the object is learned and built. This is a detection model based on deep learning.
Then, in step S207, the detection model obtained in the above steps is tested with the test data selected from the sensed data. The result includes a part that passes the test and a part that fails. The data that pass the test indicate that the detection model meets expectations, meaning that the model parameters derived from the current detection model are appropriate and need no adjustment. If, however, the result includes a part that fails the test, then in step S209 a machine learning algorithm is executed on the failed test data, the detection model obtained above is trained with historical parameter data to yield a parameter model, and a model optimization algorithm then optimizes the detection model; this is a detection model based on machine learning. In step S211, the optimized control parameters of the object to be detected are obtained.
Machine learning covers many methods, such as support vector machines, random forests, naïve Bayes, and deep learning algorithms. The deep learning algorithm mentioned above is one kind of machine learning method: using the computing power of the processor in the computer system, the large amount of acquired sensed data can be passed through linear or non-linear transformations in multiple processing layers, and a feature extraction step obtains the features in the data that represent the characteristics of the sensed data of the object to be detected.
Common deep learning algorithms use convolutional neural networks (CNN) or recurrent neural networks (RNN). A convolutional neural network filters the sensed data with convolution operations in its convolutional layers and, using the computing power of the processor, gradually extracts the features, so that a model (such as the above detection model) can eventually be built and used to detect the sensed data produced by the system.
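As an illustration only, a minimal CNN of the kind described above could be defined as follows. This sketch assumes TensorFlow/Keras is available and that each sensed sample has already been converted into a fixed-size 2-D array (for example a spectrogram); the input shape, layer sizes, and the binary normal/abnormal output are illustrative assumptions, not values prescribed by the patent.

```python
# Minimal sketch of a CNN-based detection model (assumes TensorFlow/Keras).
# Input shape and layer sizes are illustrative assumptions, not values from the patent.
import tensorflow as tf

def build_detection_model(input_shape=(128, 128, 1)):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),  # convolution filters the sensed data
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),  # deeper layer extracts higher-level features
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),    # normal / abnormal decision
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model
```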
For example, the sensed data obtained from the object to be detected may be image data, sound data, vibration data, and so on. In the method, the system can set one part of the sensed data (e.g. 90%) as training data, which is used to build the detection model through deep learning algorithms (such as CNN or RNN), and the remaining part (e.g. 10%) as testing data, which is used to test the detection model built by the system.
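One possible way to realize this split is sketched below; the helper name split_sensed_data is hypothetical, NumPy is an assumed dependency, and the 90/10 proportions (plus a further slice of the training set reserved as validation data) follow the example ratios above rather than any fixed requirement.

```python
# Sketch: split sensed data into training / validation / test sets (assumes NumPy arrays).
import numpy as np

def split_sensed_data(x, y, test_ratio=0.1, val_ratio=0.1, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))                 # shuffle sample indices
    n_test = int(len(x) * test_ratio)             # e.g. 10% held out for testing the model
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    n_val = int(len(train_idx) * val_ratio)       # part of the training data becomes validation data
    val_idx, train_idx = train_idx[:n_val], train_idx[n_val:]
    return (x[train_idx], y[train_idx]), (x[val_idx], y[val_idx]), (x[test_idx], y[test_idx])
```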
FIG. 3 shows a flow embodiment of the intelligent detection method for sensed data as executed by the system.
At the beginning of the flow, sensed data are generated from the object to be detected (301) and form the sensed data (303) from which the intelligent detection method builds its model. The sensed data, or a part of them (such as the training data taken from the sensed data), are used to execute a deep learning algorithm (305) to build a detection model based on deep learning. This detection model is then tested (307) with part of the data (such as the test data taken from the sensed data). For the part that passes the test, no subsequent parameter optimization is needed and the flow ends (309); the part that fails the test is used to build a parameter model (311).
In building the parameter model (311), machine learning is performed again on the data that failed the test. Historical parameter data are obtained from the historical parameter database (313) in the system, the model is trained on these parameters to build the parameter model, and a model optimization algorithm (315) optimizes the model parameters of the detection model, forming a detection model based on machine learning. Finally, optimized control parameters are produced (317), which can again be fed into the object to be detected so that it continues to generate sensed data (301).
It is worth noting that the control parameters are the parameters that drive the operation of the object to be detected, or the operating parameters of the system that produces the object. In particular, in one embodiment, validation data can also be taken from the training data; when the training data form the detection model through the deep learning algorithm, the validation data can be used to validate the detection model so as to obtain a parameter model for generating control parameters. Only then is the parameter model optimized by the model optimization algorithm (315) and the optimized control parameters output.
In this way, the optimized control parameters (317) drive the object to be detected, sensed data are obtained again (301), and the control parameters of the object are continuously optimized by repeating the intelligent detection method until a state the system has preset as expected is reached.
FIG. 4 then shows another flow embodiment of the system's operation. In this embodiment, the continuously generated sensed data (303) kept in the system become a historical sensing database (400); deep learning (305) continues on this historical sensing database 400, and the computing power of the computer system keeps training on the data and refining the detection. On the other hand, after the detection model has been tested, the parameters of the data that pass the test can become part of the historical parameter database (313), and machine learning is carried out repeatedly to optimize the parameter model.
According to one embodiment, when the model optimization algorithm is executed, a variety of optimization methods may be adopted, for example one of genetic algorithms, particle swarm optimization, ant colony optimization, simulated annealing, tabu search, the seagull optimization algorithm, and Bayesian optimization. The parameter models to be optimized include, for example, a gradient boosting decision tree model, an extreme gradient boosting model, a categorical boosting model, a LightGBM model, a random forest model, a support vector machine model, a relevance vector machine model, a naïve Bayes classification model, a K-nearest-neighbor model, a CNN model, or an RNN model. In a preferred embodiment, the system may adopt Bayesian optimization as the model optimization algorithm, a CNN model as the deep learning algorithm, and an extreme gradient boosting model as the parameter model.
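As a hedged illustration of this combination, the sketch below tunes a gradient-boosting parameter model against a validation set. A plain random search stands in for the Bayesian optimization named in the preferred embodiment, and scikit-learn's GradientBoostingClassifier stands in for extreme gradient boosting (XGBoost), since the patent does not prescribe a particular library; the hyperparameter ranges are also illustrative assumptions.

```python
# Sketch: optimize the hyperparameters of a gradient-boosting parameter model.
# Random search is a stand-in for Bayesian optimization; scikit-learn's
# GradientBoostingClassifier is a stand-in for extreme gradient boosting.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score

def tune_parameter_model(x_train, y_train, x_val, y_val, n_trials=20, seed=0):
    rng = np.random.default_rng(seed)
    best_model, best_score = None, -1.0
    for _ in range(n_trials):
        params = {
            "n_estimators": int(rng.integers(50, 300)),
            "learning_rate": float(10 ** rng.uniform(-3, -0.5)),
            "max_depth": int(rng.integers(2, 8)),
        }
        model = GradientBoostingClassifier(**params).fit(x_train, y_train)
        score = accuracy_score(y_val, model.predict(x_val))  # evaluate on the validation data
        if score > best_score:
            best_model, best_score = model, score
    return best_model, best_score
```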
According to one embodiment, a K-fold cross-validation procedure can be used in the intelligent detection method for sensed data. In this procedure the training data are divided into K groups; one group serves as the data for validating the model and the other K−1 groups serve as training data, the K groups are then rotated through K rounds of validation, and the parameters for evaluating the model are finally obtained. With this K-fold cross-validation, the validation data are used to evaluate the multiple models produced by multiple deep learning algorithms, and the detection model is selected from among them. One way of evaluating the models is to derive factors such as the accuracy, precision, and recall of each model, including evaluation formulas built from these factors.
For example, taking 10-fold cross-validation as an instance, the acquired sensed data are divided into 10 equal parts. In the first round, the first part serves as the test data for testing the model and the remaining 9 parts are used as training data. In the next round, the second part serves as the data for testing the detection model and the remaining 9 parts are again used for training. Proceeding in this way for a total of 10 rounds yields 10 accuracy values, and their average gives a more objective accuracy.
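The 10-fold procedure just described can be sketched as follows; the build_and_train callable is a placeholder for whichever learning algorithm the system uses, and NumPy is an assumed dependency.

```python
# Sketch: K-fold (here 10-fold) cross-validation of a detection model (assumes NumPy).
import numpy as np

def k_fold_accuracy(x, y, build_and_train, k=10, seed=0):
    """build_and_train(x_train, y_train) must return an object with .predict()."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    folds = np.array_split(idx, k)                 # divide the sensed data into k equal parts
    scores = []
    for i in range(k):
        test_idx = folds[i]                        # one part tests the model in this round
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])  # remaining k-1 parts train it
        model = build_and_train(x[train_idx], y[train_idx])
        scores.append(np.mean(model.predict(x[test_idx]) == y[test_idx]))
    return float(np.mean(scores))                  # average of the k accuracy values
```

Where scikit-learn is available, a comparable per-fold accuracy can also be obtained with sklearn.model_selection.cross_val_score instead of this manual loop.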
In one embodiment, the correlation between each detection model's predicted result, yes or no (YES/NO), and the actual result obtained by actually measuring the object to be detected, yes or no (YES/NO), is used to derive the "accuracy", "precision", and "recall" that serve as the basis for evaluating the detection models.
For example, data for which the model predicts YES and the actual result is also YES are counted as "TP"; data for which the model predicts YES but the actual result is NO (a wrong prediction) are counted as "FP"; data for which the model predicts NO but the actual result is YES (a wrong prediction) are counted as "FN"; and data for which the model predicts NO and the actual result is also NO are counted as "TN".
Based on the above definitions, the "accuracy" represents the proportion of correct predictions by the model: Accuracy = (TP + TN) / N_total, that is, the data for which the model predicts YES and the actual result is YES (TP), plus the data for which the model predicts NO and the actual result is NO (TN), divided by all the data (N_total).
The "precision" is one of the main targets for which the whole system optimizes the detection model: Precision = TP / (TP + FP), that is, the proportion of cases that are actually YES and predicted YES (TP) among all cases the model predicts as YES (TP + FP).
The "recall" indicates how completely the model captures the positive cases: Recall = TP / (TP + FN), that is, the proportion of data for which the model predicts YES and the actual result is also YES (TP) among all data whose actual result is YES (TP + FN).
Further, the above "accuracy", "precision", and "recall" results can be combined in further computations, such as F1 Score = 2 / ((1/Precision) + (1/Recall)), as a basis for evaluating the relative merit of the models.
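These four counts and the derived scores can be computed directly from the predicted and actual labels, for example as in the minimal sketch below, where YES/NO is encoded as boolean True/False and the function name is illustrative.

```python
# Sketch: accuracy, precision, recall and F1 from YES/NO predictions and actual results.
def evaluate_detection(predicted, actual):
    tp = sum(p and a for p, a in zip(predicted, actual))          # predicted YES, actually YES
    fp = sum(p and not a for p, a in zip(predicted, actual))      # predicted YES, actually NO
    fn = sum(not p and a for p, a in zip(predicted, actual))      # predicted NO, actually YES
    tn = sum(not p and not a for p, a in zip(predicted, actual))  # predicted NO, actually NO
    n_total = tp + fp + fn + tn
    accuracy = (tp + tn) / n_total
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 / ((1 / precision) + (1 / recall)) if precision and recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}
```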
FIG. 5 shows a schematic diagram of an embodiment of how the training data, validation data, and test data operate in the intelligent detection method for sensed data; it is described below together with the flowchart of the method embodiment shown in FIG. 6.
The database 50 in the system holds sensed data and parameter data (step S601). The sensed data are mainly divided into training data 501 and test data 505, and a part of the training data 501 can be used as validation data 503 (step S603). Based on the training data 501, a model is trained (52) with a deep learning algorithm to obtain a detection model (step S605). The model is tested (56) with the test data 505 (step S607) to obtain the parameter model used to generate control parameters; at this point, the machine learning model can generate the control parameters (step S609). On the other hand, the validation data 503 are used to validate the model, yielding a parameter model (step S611).
According to one embodiment, the parameter model can be optimized by the model optimization algorithm (53) (step S613), testing of the parameter model with the test data can continue (step S615), and the detection model (54) is then formed. By repeating the above flow, different deep learning algorithms can be introduced and multiple detection models obtained through repeated optimization with the training data 501, validation data 503, and test data 505, in order to generate control parameters (step S617). The models are then compared (57), the model suitable for the system is determined according to the requirements (58), and the intelligent detection flow for sensed data is completed (59).
For the above way of optimizing the model, reference may be made to the example optimization flowchart of the intelligent detection method for sensed data shown in FIG. 7.
In the flow, described according to the above embodiments, in which the model optimization algorithm is used to define the optimal detection model, historical parameter data can be imported (step S701) and trained with a machine learning algorithm to build a parameter model (step S703), from which control parameters are then derived. A K-fold cross-validation procedure (step S705) uses the validation data to test the multiple models produced by multiple deep learning algorithms (step S707); the models are evaluated (for example with the factors of accuracy, precision, and recall described above) and the most suitable detection model is selected (step S709). A model parameter optimization algorithm is then executed on the selected model (step S711) to obtain the optimized detection model (step S713), which can be used to generate the optimal control parameters.
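A compact, hedged sketch of this evaluate-then-select step is shown below; the candidate estimators are arbitrary scikit-learn models chosen for illustration, not the specific models required by the patent, and the function name is hypothetical.

```python
# Sketch: evaluate several candidate models with K-fold cross-validation and keep the best one.
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def select_detection_model(x, y, k=10):
    candidates = {
        "random_forest": RandomForestClassifier(),
        "gradient_boosting": GradientBoostingClassifier(),
        "naive_bayes": GaussianNB(),
    }
    scores = {name: cross_val_score(m, x, y, cv=k).mean()  # K-fold accuracy per candidate
              for name, m in candidates.items()}
    best = max(scores, key=scores.get)
    return candidates[best].fit(x, y), scores              # refit the winner on all data
```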
As an example, reference may be made to the flowchart shown in FIG. 8, which shows an intelligent method for building a detection model for inspecting a motor.
The vibration sound of the motor is acquired through a sound sensor (step S801), the audio is converted into a spectrum, and the data in the sensing database are created (step S803). These sensed data (such as the training data taken from them) are then trained with a deep learning algorithm: the sound features related to the motor's operating condition are extracted, the correlation between the various audio signals, the presence or absence of abnormal motor operation, and the motor's control parameters is derived, and a detection model for inspecting the motor is built (step S805). Afterwards, the system uses the validation data taken from the sensed data to validate the detection model trained in the above steps (step S807), and the detection model is determined after validation (step S809).
The system then uses this detection model to generate control parameters and drive the motor while recording sound through the sensor to form audio sensed data. These sensed data (such as a visualized spectrum) are inspected (step S811), and it is judged from them whether the test is passed (step S813). If the test is passed, driving the motor with the control parameters determined by the current detection model meets the system's requirements, and the flow ends (step S815).
If the test is not passed (NO), the parameters need to be optimized further with the parameter model (step S817), and the motor is then driven with the newly generated control parameters (step S819). The sensor continues to pick up the vibration sound which, after conversion into a spectrum, is inspected by the detection model to judge whether it passes; the above steps are repeated, continuously optimizing the parameter model to obtain the optimal control parameters.
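For the audio-to-spectrum step in this motor example, a minimal conversion might look like the sketch below; the frame length, hop size, and the NumPy-based implementation are illustrative assumptions rather than details given in the patent.

```python
# Sketch: convert a recorded motor waveform into a magnitude spectrogram (assumes NumPy).
import numpy as np

def waveform_to_spectrogram(signal, frame_len=1024, hop=512):
    window = np.hanning(frame_len)
    frames = [signal[i:i + frame_len] * window
              for i in range(0, len(signal) - frame_len, hop)]  # slide a window over the audio
    spectra = [np.abs(np.fft.rfft(frame)) for frame in frames]  # magnitude spectrum per frame
    return np.stack(spectra)          # shape: (num_frames, frame_len // 2 + 1)
```

The resulting 2-D array is the kind of "visualized spectrum" that could be fed to the CNN-based detection model described earlier.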
Further, according to an embodiment of another application, the disclosed intelligent detection method for sensed data is applied to inspecting a screen that is driven by display parameters to display content. Following the flow above, an image sensor first acquires a large amount of image data of the screen, a deep learning algorithm extracts the features in the image data and establishes the correlation between the images and the screen's display parameters, and a detection model for inspecting the screen is built, including determining the detection model by validating it with the validation data. This detection model is then used to generate display parameters, image data of the screen are produced and tested, and the subsequent machine learning algorithm is executed to optimize the detection model, so as to obtain optimized display parameters for the screen.
In summary, according to the intelligent detection method and system for sensed data described in the above embodiments, when a model is built from the training data, one part of the sensed data serves as training data and another part as testing data for the model, and a further part is taken from the training data as validation data to validate the detection model produced by the system; this is an internal optimization step. Moreover, when training the model, multiple deep learning and machine learning algorithms can be adopted to produce multiple models; after these models have likewise been optimized, they are compared and evaluated, and one of them is finally selected for practical application.
The contents disclosed above are only preferred feasible embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Accordingly, all equivalent technical changes made using the contents of the description and drawings of the present invention are included within the scope of the claims of the present invention.
10: object to be detected
101: sensor one
102: sensor two
103: sensor three
12: sensed-data processing host
14: computer system
141: processor
143: memory
145: database
301: generate sensed data
303: sensed data
305: deep learning algorithm
307: test the detection model
309: end
311: parameter model
313: historical parameter database
315: model optimization algorithm
317: generate control parameters
400: historical sensing database
50: database
501: training data
503: validation data
505: test data
52: train the model
53: model optimization algorithm
54: detection model
56: test the model
57: compare models
58: determine the model
59: complete
Steps S201–S211: main flow of the intelligent detection method for sensed data
Steps S601–S617: flow of the intelligent detection method for sensed data
Steps S701–S713: optimization flow in the intelligent detection method for sensed data
Steps S801–S819: example flow of the intelligent detection method for sensed data
FIG. 1 shows a system architecture embodiment diagram of the intelligent detection system for sensed data;
FIG. 2 shows a main flow embodiment diagram of the intelligent detection method for sensed data;
FIG. 3 shows a first flow embodiment of the intelligent detection method for sensed data as executed by the system;
FIG. 4 shows a second flow embodiment of the intelligent detection method for sensed data as executed by the system;
FIG. 5 shows a schematic diagram of an embodiment of the operation of the training data, validation data, and test data in the intelligent detection method for sensed data;
FIG. 6 shows a flowchart of an embodiment of the intelligent detection method for sensed data;
FIG. 7 shows an example optimization flowchart in the intelligent detection method for sensed data; and
FIG. 8 shows an example flowchart of an application of the intelligent detection method for sensed data.
Claims (7)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW109115966A (TWI761834B) | 2020-05-14 | 2020-05-14 | Intelligent method for testing sensed data and system thereof |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| TW202143042A | 2021-11-16 |
| TWI761834B | 2022-04-21 |
Family
ID=80783337
Families Citing this family (2)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| TWI792868B * | 2021-02-03 | 2023-02-11 | 盟立自動化股份有限公司 | Storage equipment and monitoring method of storage equipment |
| CN116026514B * | 2023-03-29 | 2023-06-30 | 武汉理工大学 | Six-dimensional force sensor and nonlinear decoupling fault tolerance method for surgical clamp |
Citations (4)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| US20100278420A1 * | 2009-04-02 | 2010-11-04 | Siemens Corporation | Predicate Logic based Image Grammars for Complex Visual Pattern Recognition |
| US20150379429A1 * | 2014-06-30 | 2015-12-31 | Amazon Technologies, Inc. | Interactive interfaces for machine learning model evaluations |
| US10043064B2 * | 2015-01-14 | 2018-08-07 | Samsung Electronics Co., Ltd. | Method and apparatus of detecting object using event-based sensor |
| TWI647586B * | 2017-12-12 | 2019-01-11 | 財團法人資訊工業策進會 | Behavior inference model building apparatus and behavior inference model building method thereof |