CN110084379A - Method for calibrating instruments to be calibrated using artificial intelligence cloud computing - Google Patents

Method for calibrating instruments to be calibrated using artificial intelligence cloud computing

Info

Publication number
CN110084379A
CN110084379A (application CN201910379394.9A)
Authority
CN
China
Prior art keywords
calibrated
instrument
value
artificial intelligence
accurate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910379394.9A
Other languages
Chinese (zh)
Inventor
伍学斌
伍学聪
杨小波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Defod Precision Hardware Co Ltd
Original Assignee
Dongguan Defod Precision Hardware Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Defod Precision Hardware Co Ltd filed Critical Dongguan Defod Precision Hardware Co Ltd
Priority to CN201910379394.9A priority Critical patent/CN110084379A/en
Publication of CN110084379A publication Critical patent/CN110084379A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01D - MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00 - Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 - Testing or monitoring of control systems or parts thereof
    • G05B23/02 - Electric testing or monitoring
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/18 - Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Automation & Control Theory (AREA)
  • Medical Informatics (AREA)
  • Operations Research (AREA)
  • Probability & Statistics with Applications (AREA)
  • Evolutionary Computation (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Feedback Control In General (AREA)

Abstract

The invention discloses a method for calibrating instruments to be calibrated using artificial intelligence cloud computing. During calibration, the latest device output values of each group of instruments to be calibrated, together with a set of previously known accepted accurate device values, are sent over a network to a cloud computing server. Every group of new measured values is stored in the cloud, and software on the cloud server applies an artificial intelligence machine learning algorithm to compute an accurate adjustment value. The adjustment value of one instrument to be calibrated is compared with the data of all other instruments to be calibrated. The machine learning algorithm learns from the data and improves the accuracy of each new adjustment value in the calibration. When future device output values are computed, the adjustment value is sent to the corresponding instrument to be calibrated over the internet. This technique allows the entire calibration process to be applied to a large number of instruments to be calibrated, making the process simpler and smoother; in addition, it can also be used to improve the accuracy and reliability of a large number of instruments to be calibrated.

Description

Method for calibrating instruments to be calibrated using artificial intelligence cloud computing
Technical field
The present invention relates to the field of instrument calibration, and in particular to a method for calibrating instruments to be calibrated using artificial intelligence cloud computing. The instruments to be calibrated include: sensors, displays, measuring devices, and control devices.
Background art
At present there are many different types of instruments on the market that require calibration (herein these are collectively referred to as instruments to be calibrated), and the method we propose can be applied to all of them. These instruments to be calibrated include:
1) measuring devices that measure an environmental variable (such as temperature) and report the value to the user;
2) sensors that measure an environmental variable and send the measurement data elsewhere, for example to the cloud;
3) display devices that receive data and show data or images to the user;
4) control devices that receive data from other equipment and use it to perform local control or simple actions.
However, the manufacturing process of each instrument to be calibrated differs, so its output data is not necessarily accurate. This output includes measured values, displayed data or images, and certain control functions; herein these three categories are collectively referred to as device output values.
Each instrument to be calibrated therefore requires a calibration process, so that its measured values more accurately reflect the actual environmental variables, its displayed data or images are shown to the user more accurately, and its control functions or device actions are more precise.
Summary of the invention
At present there are many different types of instruments on the market that require calibration; these instruments to be calibrated include: sensors, displays, measuring devices, and control devices. The technical solution currently used on the market is:
comparing at least one measured data value with at least one previously accepted accurate environmental variable;
comparing at least one output display value with at least one previously accepted accurate display value;
comparing at least one control function with at least one previously accepted accurate control function;
The main purpose of these comparisons is to compute the arithmetic difference between the device output value produced by the instrument to be calibrated and the accepted accurate device value; these differences can then be used as adjustment values in the calibration process.
The adjustment value can be applied to the instrument to be calibrated by adding it to, or subtracting it from, the instrument's latest device output value, thereby improving the accuracy of new measured or displayed values.
The adjustment value must be stored somewhere through a storage mechanism inside the instrument to be calibrated; once stored, it can easily be retrieved again and applied to subsequent device output values, thereby also improving the accuracy of future device output values.
This measure-and-adjust process is known as calibration, and every newly manufactured instrument to be calibrated must repeat it, because each newly produced unit has slight manufacturing variations.
The adjustment value stored in one instrument to be calibrated is unrelated to the adjustment value of any other instrument to be calibrated; it is related only to the previously known accurate device output values of that instrument.
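The per-instrument adjustment mechanism described above can be sketched in code. The following is a minimal illustration only; the class and method names (`Instrument`, `calibrate_against`, `corrected_output`) are invented for this sketch rather than specified by the patent.

```python
class Instrument:
    """Minimal sketch of one instrument to be calibrated."""

    def __init__(self, adjustment=0.0):
        # The adjustment value is stored inside the instrument and
        # retrieved again for every new device output value.
        self.adjustment = adjustment

    def calibrate_against(self, raw_output, accepted_accurate_value):
        # The arithmetic difference between the accepted accurate value
        # and the device output value becomes the adjustment value.
        self.adjustment = accepted_accurate_value - raw_output

    def corrected_output(self, raw_output):
        # Apply the stored adjustment to a new device output value.
        return raw_output + self.adjustment


# Example: a thermometer reading 24.7 against an accepted value of 25.0.
therm = Instrument()
therm.calibrate_against(raw_output=24.7, accepted_accurate_value=25.0)
corrected = therm.corrected_output(24.7)  # approximately 25.0
```

Each unit carries its own adjustment value, matching the statement that the stored value relates only to that instrument's own known accurate outputs.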
The object of the present invention is to provide a method for calibrating instruments to be calibrated using artificial intelligence cloud computing.
To achieve the above object, the technical solution adopted by the present invention is as follows:
A method for calibrating instruments to be calibrated using artificial intelligence cloud computing:
comparing at least one measured data value with at least one previously accepted accurate environmental variable;
comparing at least one output display value with at least one previously accepted accurate display value;
comparing at least one control function with at least one previously accepted accurate control function;
During calibration, the latest device output values of each group of new instruments to be calibrated, together with a set of previously known, error-free accepted accurate device values, are sent over the network to a cloud computing server (or, for security reasons, to a local computing server). Artificial intelligence machine learning software on the cloud server computes accurate calibration device values. From the device output values and the calibration device values a target prediction function (F) is computed, and this target prediction function (F) serves as the adjustment value in the calibration process.
The adjustment value is sent to the corresponding instrument to be calibrated over the internet.
Every group of new measured values is stored in the cloud. Storing these values is very important, because the large number of collected data points, combined with the artificial intelligence machine learning algorithm, helps improve the accuracy of the whole algorithm.
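The upload of measured-value groups to the server might look like the following sketch. The payload layout and field names are assumptions for illustration only, and the actual transport to the cloud (or local) computing server is not shown.

```python
import json


def build_upload_payload(instrument_id, device_outputs, accepted_accurate_values):
    """Pair each new device output value with its previously known
    accepted accurate device value, ready to send to the server."""
    if len(device_outputs) != len(accepted_accurate_values):
        raise ValueError("each output needs a matching accepted accurate value")
    return json.dumps({
        "instrument_id": instrument_id,
        "samples": [
            {"device_output": o, "accepted_accurate": a}
            for o, a in zip(device_outputs, accepted_accurate_values)
        ],
    })


payload = build_upload_payload("unit-0001", [24.7, 25.1], [25.0, 25.0])
# The payload would then be sent to the cloud (or local) calibration
# server over the network, where every group is stored for training.
```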
Machine learning, and more specifically predictive modeling, is primarily concerned with minimizing the error of a model, i.e. with making the most accurate predictions possible. A machine learning algorithm can be described as learning a target prediction function (F) that best maps the input variables (X) to the output variable (Y); in other words, it predicts the output (Y) from the input (X):
Y = F(X)
The algorithm we apply is regression analysis, one of the branches of artificial intelligence machine learning. We may also combine 1) univariate regression, 2) multivariate regression, 3) linear regression, and 4) nonlinear regression.
To further illustrate our method, we explain it with a multiple linear regression model F(X); the same principle, however, can also be applied to univariate regression, linear regression, and nonlinear regression.
In building and training the model, our regression model F(X) is described as follows:
Y = C + M1·X1 + M2·X2 + M3·X3 + ... + Mn·Xn
In our scheme, the model can be trained using this multiple linear regression:
Let
accepted accurate device value = Y
device output value = X1
Prediction function F:
accepted accurate device value Y = C + M1·(device output value X1) + M2·X2 + ... + Mn·Xn
To improve our model, we can further feed other manufacturing-related data into X2, X3, ..., Xn of the model.
For example:
Let
X2 = manufacturing batch number
X3 = manufacturing timestamp (Unix timestamp)
These can be used to help link an instrument to be calibrated to the manufactured parts of a particular batch, and to link specified manufacturing changes to specified component parts.
As the machine learning model is trained, more and more data from the device output values and the "accepted accurate device values" is collected and fed into the model, and the prediction function F is computed; the function F becomes more and more accurate, and the coefficients C, M1, M2, ..., Mn become more and more accurate. This is one of the basic principles of training, or learning, in artificial intelligence machine learning.
After we have sufficiently trained the model with enough training data, we can use the model (the function F with coefficients C, M1, M2, ..., Mn) to predict future artificial intelligence calibration device values and perform a preliminary calibration of the instruments to be calibrated:
Let
artificial intelligence calibration device value = Y
device output value = X1
Applying the prediction function F:
artificial intelligence calibration device value Y = C + M1·(device output value X1) + M2·X2 + ... + Mn·Xn
The coefficients C, M1, M2, ..., Mn can be downloaded to the instrument to be calibrated, ready to improve the accuracy of new measured or displayed values in the future.
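Under the scheme above, the training and prediction steps can be sketched with a univariate least-squares fit (the text notes the same principle extends to multivariate and nonlinear regression). The training data below is invented for illustration only.

```python
def fit_linear(x1, y):
    """Least-squares fit of Y = C + M1*X1, where X1 is the device
    output value and Y is the accepted accurate device value."""
    n = len(x1)
    mean_x = sum(x1) / n
    mean_y = sum(y) / n
    sxx = sum((v - mean_x) ** 2 for v in x1)
    sxy = sum((vx - mean_x) * (vy - mean_y) for vx, vy in zip(x1, y))
    m1 = sxy / sxx
    c = mean_y - m1 * mean_x
    return c, m1


# Device output values paired with accepted accurate device values
# (invented training data for illustration).
device_outputs = [10.2, 20.4, 30.5, 40.9, 50.8]
accepted_accurate = [10.0, 20.0, 30.0, 40.0, 50.0]

C, M1 = fit_linear(device_outputs, accepted_accurate)


def ai_calibrated_value(x1):
    # Prediction function F: AI calibration device value = C + M1*X1.
    return C + M1 * x1


# The coefficients C, M1 would be downloaded to the instrument
# and applied to future device output values.
```

As more (device output, accepted accurate) pairs accumulate in the cloud, refitting yields increasingly accurate coefficients, which is the training principle the text describes.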
The advantage of this calibration method is that a large number of instruments to be calibrated can be calibrated by very few operators, because it saves the time and manpower of comparing the "accepted accurate device value" against the device output value for every new instrument to be calibrated.
As a second step, for each new instrument to be calibrated, the "accepted accurate device value" is also compared with the artificial intelligence calibration device value. The main purpose of this comparison is to compute the arithmetic difference; these differences are the artificial intelligence calibration device value errors of the individual instruments to be calibrated. When there is a large number of instruments to be calibrated, the cloud server can use the standard variance and standard deviation formulas to compute the standard deviation of the device output values of many representative instruments.
Once the standard deviation has been established, any future measured difference between the accepted accurate device value and the artificial intelligence calibration device value can be compared against it. If the difference exceeds one or two standard deviations, this indicates a significant problem either in the manufacture of the device or in the operator's manual measurement of the device output value; such cases can be flagged as major issues for further investigation. This step helps improve the reliability of the devices and helps flag any problems in device manufacturing.
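The standard-deviation check in this step can be sketched as follows, using the one-or-two-standard-deviation threshold from the text (two here). The error values are invented for illustration.

```python
import statistics


def flag_outliers(errors, threshold_sigma=2.0):
    """Flag instruments whose calibration error deviates from the mean
    by more than the chosen number of sample standard deviations,
    marking them for further investigation."""
    sigma = statistics.stdev(errors)   # sample standard deviation
    mean = statistics.fmean(errors)
    return [i for i, e in enumerate(errors)
            if abs(e - mean) > threshold_sigma * sigma]


# One error per instrument: accepted accurate device value minus
# AI calibration device value (invented values).
errors = [0.1, -0.2, 0.05, 0.15, -0.1, 3.0]
print(flag_outliers(errors))  # [5]
```

The instrument with error 3.0 falls outside two standard deviations and would be flagged as a possible manufacturing or measurement problem.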
The invention has the following advantages:
During calibration, the latest device output values of each group of instruments to be calibrated, together with a set of previously known accepted accurate device values, are sent over the network to a cloud computing server (or, for security reasons, to a local computing server). Every group of new measured values is stored in the cloud, and by applying an artificial intelligence machine learning algorithm, software on the cloud server computes accurate adjustment values.
The adjustment value of one instrument to be calibrated is compared with the data of all other instruments to be calibrated. The artificial intelligence machine learning algorithm and software improve the accuracy and reliability of each new adjustment value in the calibration. When future device output values are computed, the adjustment value is sent to the corresponding instrument to be calibrated over the internet.
This technique allows the entire calibration process to be applied to a large number of instruments to be calibrated, making the process simpler and smoother; in addition, it can also be used to improve the accuracy and reliability of a large number of instruments to be calibrated.
To explain the structural features and functions of the invention more clearly, it is described in detail below with reference to the accompanying drawing and a specific embodiment.
Brief description of the drawings
Fig. 1 is a schematic diagram of the calibration method of the present invention.
Specific embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
This embodiment provides a method for calibrating instruments to be calibrated using artificial intelligence cloud computing:
comparing at least one measured data value with at least one previously accepted accurate environmental variable;
comparing at least one output display value with at least one previously accepted accurate display value;
comparing at least one control function with at least one previously accepted accurate control function;
During calibration, the latest device output values of each group of new instruments to be calibrated, together with a set of previously known, error-free accepted accurate device values, are sent over the network to a cloud computing server (or, for security reasons, to a local computing server). Artificial intelligence machine learning software on the cloud server computes accurate calibration device values. From the device output values and the calibration device values a target prediction function (F) is computed, and this target prediction function (F) serves as the adjustment value in the calibration process.
The adjustment value is sent to the corresponding instrument to be calibrated over the internet.
Every group of new measured values is stored in the cloud. Storing these values is very important, because the large number of collected data points, combined with the artificial intelligence machine learning algorithm, helps improve the accuracy of the whole algorithm.
Machine learning, and more specifically predictive modeling, is primarily concerned with minimizing the error of a model, i.e. with making the most accurate predictions possible. A machine learning algorithm can be described as learning a target prediction function (F) that best maps the input variables (X) to the output variable (Y); in other words, it predicts the output (Y) from the input (X):
Y = F(X)
The algorithm we apply is regression analysis, one of the branches of artificial intelligence machine learning. In addition, we may also combine 1) univariate regression, 2) multivariate regression, 3) linear regression, and 4) nonlinear regression.
To further illustrate our method, we explain it with a multiple linear regression model F(X); the same principle, however, can also be applied to univariate regression, linear regression, and nonlinear regression.
In building and training the model, our regression model F(X) is described as follows:
Y = C + M1·X1 + M2·X2 + M3·X3 + ... + Mn·Xn
In our scheme, the model can be trained using this multiple linear regression:
Let
accepted accurate device value = Y
device output value = X1
Prediction function F:
accepted accurate device value Y = C + M1·(device output value X1) + M2·X2 + ... + Mn·Xn
To improve our model, we can further feed other manufacturing-related data into X2, X3, ..., Xn of the model.
For example:
Let
X2 = manufacturing batch number
X3 = manufacturing timestamp (Unix timestamp)
These can be used to help link an instrument to be calibrated to the manufactured parts of a particular batch, and to link specified manufacturing changes to specified component parts.
As the machine learning model is trained, more and more data from the device output values and the "accepted accurate device values" is collected and fed into the model, and the prediction function F is computed; the function F becomes more and more accurate, and the coefficients C, M1, M2, ..., Mn become more and more accurate. This is one of the basic principles of training, or learning, in artificial intelligence machine learning.
After we have sufficiently trained the model with enough training data, we can use the model (the function F with coefficients C, M1, M2, ..., Mn) to predict future artificial intelligence calibration device values and perform a preliminary calibration of the instruments to be calibrated:
Let
artificial intelligence calibration device value = Y
device output value = X1
Applying the prediction function F:
artificial intelligence calibration device value Y = C + M1·(device output value X1) + M2·X2 + ... + Mn·Xn
The coefficients C, M1, M2, ..., Mn can be downloaded to the instrument to be calibrated, ready to improve the accuracy of new measured or displayed values in the future.
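On the instrument side, applying the downloaded coefficients might look like the following sketch; the coefficient values and storage format are assumptions for illustration, not part of the patent.

```python
# Hypothetical on-instrument use of cloud-trained coefficients.
# C and M1 would be downloaded from the server; values here are invented.
coefficients = {"C": 0.12, "M1": 0.98}


def corrected_reading(raw_device_output, coef):
    # Improve a new measured/displayed value with the prediction
    # function: Y = C + M1 * X1.
    return coef["C"] + coef["M1"] * raw_device_output


reading = corrected_reading(25.4, coefficients)  # approximately 25.012
```

Storing only the coefficients keeps the on-device footprint small, while the training itself stays in the cloud as the text describes.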
The advantage of this calibration method is that a large number of instruments to be calibrated can be calibrated by very few operators, because it saves the time and manpower of comparing the "accepted accurate device value" against the device output value for every new instrument to be calibrated.
As a second step, for each new instrument to be calibrated, the accepted accurate device value is also compared with the artificial intelligence calibration device value. The main purpose of this comparison is to compute the arithmetic difference; these differences are the artificial intelligence calibration device value errors of the individual instruments to be calibrated. When there is a large number of instruments to be calibrated, our cloud server can use the standard variance and standard deviation formulas to compute the standard deviation of the device output values of many representative instruments.
Once the standard deviation has been established, any future measured difference between the accepted accurate device value and the artificial intelligence calibration device value can be compared against it. If the difference exceeds one or two standard deviations, this indicates a significant problem either in the manufacture of the device or in the operator's manual measurement of the device output value; such cases can be flagged as major issues for further investigation. This step helps improve the reliability of the devices and helps flag any problems in device manufacturing.
Key design points:
During calibration, the latest device output values of each group of instruments to be calibrated, together with a set of previously known "accepted accurate device values", are sent over the network to a cloud computing server. Every group of new measured values is stored in the cloud, and by applying an artificial intelligence machine learning algorithm, software on the cloud server computes accurate adjustment values.
The adjustment value of one instrument to be calibrated is compared with the data of all other instruments to be calibrated. The artificial intelligence machine learning algorithm and software improve the accuracy and reliability of each new adjustment value in the calibration. When future device output values are computed, the adjustment value is sent to the corresponding instrument to be calibrated over the internet.
This technique allows the entire calibration process to be applied to a large number of instruments to be calibrated, making the process simpler and smoother; in addition, it can also be used to improve the accuracy and reliability of a large number of instruments to be calibrated.
As shown in Fig. 1: 100 is the accepted accurate instrument; 200 is the instrument to be calibrated, comprising a measuring device 201, a sensor 202, a display 203, and a controller 204; 300 is a wired/wireless network communicator; 400 is a wired/wireless local area network; 500 is a router/modem; 501 is the internet; 600 is local storage, 601 is a local server, and 602 is an artificial intelligence calculator; 700 is the cloud computing side, comprising a cloud server 701, cloud storage 702, and an artificial intelligence calculator 703.
The above is only a preferred embodiment of the present invention and is not intended to limit the present invention in any form. Any person skilled in the art may, without departing from the scope of the technical solution of the present invention, use the methods and technical content disclosed above to make many possible changes and modifications to the technical solution of the present invention, or amend it into equivalent embodiments with equivalent variations. Therefore, any equivalent change made according to the shape, construction, and principle of the present invention without departing from the technical solution of the present invention shall be covered by the protection scope of the present invention.

Claims (4)

1. A method for calibrating instruments to be calibrated using artificial intelligence cloud computing, characterized by:
comparing at least one measured data value with at least one previously accepted accurate environmental variable;
comparing at least one output display value with at least one previously accepted accurate display value;
comparing at least one control function with at least one previously accepted accurate control function;
the purpose of the comparisons being to compute the arithmetic difference between the device output value produced by the instrument to be calibrated and the accepted accurate device value, these differences then serving as adjustment values in the calibration process;
the adjustment value being applicable to the instrument to be calibrated by adding it to, or subtracting it from, the instrument's latest device output value, thereby improving the accuracy of new measured or displayed values;
the adjustment value being stored through a storage mechanism inside the instrument to be calibrated, from which it can easily be retrieved again after storage and applied to subsequent device output values, thereby also improving the accuracy of future device output values;
this measure-and-adjust process being known as calibration, which every newly manufactured instrument to be calibrated must repeat, because each newly produced unit has slight manufacturing variations;
the adjustment value stored in one instrument to be calibrated being unrelated to the adjustment value of any other instrument to be calibrated, and related only to the previously known accurate device output values of that instrument.
2. The method for calibrating instruments to be calibrated using artificial intelligence cloud computing according to claim 1, characterized in that:
during calibration, the latest device output values of each group of new instruments to be calibrated, together with a set of previously known, error-free accepted accurate device values, are sent over the network to a cloud computing server or a local computing server; artificial intelligence machine learning software on the cloud server computes accurate calibration device values; from the device output values and the calibration device values a target prediction function (F) is computed, and this target prediction function (F) serves as the adjustment value in the calibration process;
the adjustment value is sent to the corresponding instrument to be calibrated over the internet;
every group of new measured values is stored in the cloud, which helps improve the accuracy of the whole algorithm through the large number of collected data points combined with the artificial intelligence machine learning algorithm;
machine learning, and more specifically predictive modeling, is primarily concerned with minimizing the error of a model, i.e. with making the most accurate predictions possible; a machine learning algorithm can be described as learning a target prediction function (F) that maps the input variables (X) to the output variable (Y), i.e. predicting the output (Y) from the input (X): Y = F(X);
the algorithm applied is regression analysis, one of the branches of artificial intelligence machine learning, and may be combined with univariate regression, multivariate regression, linear regression, and nonlinear regression;
to further illustrate the method it is explained with a multiple linear regression model F(X), although the same principle can also be applied to univariate regression, linear regression, and nonlinear regression;
in building and training the model, the regression model F(X) is described as follows:
Y = C + M1·X1 + M2·X2 + M3·X3 + ... + Mn·Xn
the model can be trained using this multiple linear regression:
Let
accepted accurate device value = Y
device output value = X1
Prediction function F:
accepted accurate device value Y = C + M1·(device output value X1) + M2·X2 + ... + Mn·Xn
to improve the model, other manufacturing-related data can further be fed into X2, X3, ..., Xn of the model, for example:
Let
X2 = manufacturing batch number
X3 = manufacturing timestamp (Unix timestamp)
these can be used to help link an instrument to be calibrated to the manufactured parts of a particular batch, and to link specified manufacturing changes to specified component parts;
as the machine learning model is trained, more and more data from the device output values and the accepted accurate device values is collected and fed into the model, and the prediction function F is computed; the function F and the coefficients C, M1, M2, ..., Mn become more and more accurate;
after the model has been sufficiently trained with enough training data, the model (the function F with coefficients C, M1, M2, ..., Mn) can be used to predict future calibration device values and perform a preliminary calibration of the instruments to be calibrated.
3. according to claim 1 or claim 2 treat the method that calibration instrument is calibrated, feature using artificial intelligence cloud computing It is:
For each new instrument to be calibrated, generally acknowledged accurate device value and artificial intelligence calibrator (-ter) unit value are compared, and is carried out pair The main purpose of ratio is to calculate arithmetic difference, and the artificial intelligence calibrator (-ter) unit value that these differences are each instruments to be calibrated is missed When there is a large amount of instrument to be calibrated, Cloud Server can be used following formula and calculate many representative instruments to be set poor The standard deviation of standby output valve:
Sample variance: s² = Σᵢ(xᵢ − x̄)² / (n − 1)
Population variance: σ² = Σᵢ(xᵢ − μ)² / N
Sample standard deviation: s = √( Σᵢ(xᵢ − x̄)² / (n − 1) )
Population standard deviation: σ = √( Σᵢ(xᵢ − μ)² / N )
where x̄ is the sample mean, μ is the population mean, n is the sample size and N is the population size.
Once the standard deviation is established, any future difference between the accepted accurate device value and the artificial-intelligence calibration device value can be compared against it. If the difference exceeds one or two standard deviations, this indicates that the fault lies not in the manufacture of the equipment but in some significant error made by the operator when manually measuring the equipment output value; such cases can be flagged as major issues for further investigation. This step helps improve the reliability of the equipment and helps flag any problems in equipment manufacturing.
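A minimal sketch of the flagging step described in claim 3 (again not part of the patent text; the difference values are invented for illustration):

```python
import math

def sample_std(values):
    """Sample standard deviation: s = sqrt(sum((x - mean)^2) / (n - 1))."""
    n = len(values)
    mean = sum(values) / n
    return math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

# Hypothetical differences between the accepted accurate device value and
# the AI-predicted calibration value across a fleet of instruments; the
# last entry simulates an operator measurement blunder.
diffs = [0.10, -0.05, 0.08, -0.12, 0.03, 0.07, -0.09, 1.50]

baseline = diffs[:-1]                    # representative, known-good instruments
s = sample_std(baseline)
mean = sum(baseline) / len(baseline)

# Flag any difference more than two standard deviations from the mean:
flagged = [d for d in diffs if abs(d - mean) > 2 * s]
```

With the one-standard-deviation threshold of the claim, `2 * s` above would simply become `s`; the two-sigma version flags fewer, more clear-cut cases.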
4. The method of calibrating an instrument to be calibrated using artificial-intelligence cloud computing according to any one of claims 1, 2 or 3, characterized in that the instrument to be calibrated comprises: a sensor, a display, a measuring device, or a control device.
CN201910379394.9A 2019-05-08 2019-05-08 Method for calibrating an instrument to be calibrated using artificial intelligence cloud computing Pending CN110084379A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910379394.9A CN110084379A (en) 2019-05-08 2019-05-08 Method for calibrating an instrument to be calibrated using artificial intelligence cloud computing


Publications (1)

Publication Number Publication Date
CN110084379A true CN110084379A (en) 2019-08-02

Family

ID=67419181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910379394.9A Pending CN110084379A (en) 2019-05-08 2019-05-08 Method for calibrating an instrument to be calibrated using artificial intelligence cloud computing

Country Status (1)

Country Link
CN (1) CN110084379A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110672143A (en) * 2019-11-07 2020-01-10 徐州陀微传感科技有限公司 Sensor calibration method
CN111044460A (en) * 2019-12-31 2020-04-21 常州罗盘星检测科技有限公司 Calibration method of artificial intelligence instrument
CN111338304A (en) * 2020-03-02 2020-06-26 顺忠宝智能科技(深圳)有限公司 Method for real-time prediction and information communication of production line yield by applying artificial intelligence cloud computing
CN114935354A (en) * 2022-03-05 2022-08-23 深圳天溯计量检测股份有限公司 Calibration method of position type adjusting instrument

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206847849U (en) * 2017-05-17 2018-01-05 福建省正毅工业计量站有限公司 Torque lever calibrating installation
CN108469273A (en) * 2018-02-27 2018-08-31 济宁中科云天环保科技有限公司 High in the clouds data joint debugging calibration method based on machine learning algorithm
CN109631973A (en) * 2018-11-30 2019-04-16 苏州数言信息技术有限公司 A kind of automatic calibrating method and system of sensor
US20190121782A1 (en) * 2017-10-19 2019-04-25 International Business Machines Corporation Adaptive calibration of sensors through cognitive learning



Similar Documents

Publication Publication Date Title
CN110084379A (en) Method for calibrating an instrument to be calibrated using artificial intelligence cloud computing
US20060259198A1 (en) Intelligent system for detection of process status, process fault and preventive maintenance
Graham et al. Modeling the ready mixed concrete delivery system with neural networks
CN101923340A (en) Method and apparatus for predicting process quality in a process control system
Hadjiiski et al. A comparison of modeling nonlinear systems with artificial neural networks and partial least squares
CN104239659A (en) Carbon steel corrosion rate prediction method of back propagation (BP) neural network
Taylor et al. Building price-level forecasting: an examination of techniques and applications
Herger et al. Ensemble optimisation, multiple constraints and overconfidence: a case study with future Australian precipitation change
Jiang et al. Dynamic measurement errors prediction for sensors based on firefly algorithm optimize support vector machine
Ruhm Measurement plus observation–A new structure in metrology
Bal et al. A comparison of different model selection criteria for forecasting EURO/USD exchange rates by feed forward neural network
CN112270124A (en) Real-time irrigation method and system
US9182752B2 (en) Method and system for multi-zone modeling to determine material properties in storage tanks
Abbasi Ganji et al. Fuzzy process capability indices for simple linear profile
CN106446405B (en) A kind of integrated circuit device neural net model establishing Method of Sample Selection and device
CN114528764A (en) Soft measurement modeling method and device based on integral optimization and instant learning
Mousavi et al. Simulation-based real-time performance monitoring (simmon): A platform for manufacturing and healthcare systems
CN108921434A (en) A method of user capability prediction is completed by human-computer interaction
Suryono et al. Web-based fuzzy time series for environmental temperature and relative humidity prediction
Budylina et al. Methods to ensure the reliability of measurements in the age of Industry 4.0
Sharipov Application of Artificial Neural Networks for Moisture Control
Jayakumar et al. Application of machine learning on crop yield prediction in agriculture enforcement
Eren Calibrations in Process Control
Danesh et al. A combinatorial algorithm for fuzzy parameter estimation with application to uncertain measurements
Macchietto et al. Monitoring and on-line optimisation of processes using speedup

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination