CN113686025B - Bubble water preparation method and device, computer equipment and readable storage medium - Google Patents

Bubble water preparation method and device, computer equipment and readable storage medium

Info

Publication number
CN113686025B
CN113686025B (application CN202110999825.9A)
Authority
CN
China
Prior art keywords
preparation effect
training
verification
sample
training sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110999825.9A
Other languages
Chinese (zh)
Other versions
CN113686025A (en)
Inventor
罗淦恩
王琦
潘叶江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vatti Co Ltd
Original Assignee
Vatti Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vatti Co Ltd
Priority to CN202110999825.9A
Publication of CN113686025A
Application granted
Publication of CN113686025B
Legal status: Active

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24H FLUID HEATERS, e.g. WATER OR AIR HEATERS, HAVING HEAT-GENERATING MEANS, e.g. HEAT PUMPS, IN GENERAL
    • F24H 9/00 Details
    • F24H 9/20 Arrangement or mounting of control or safety devices
    • F24H 9/2007 Arrangement or mounting of control or safety devices for water heaters
    • F24H 9/2035 Arrangement or mounting of control or safety devices for water heaters using fluid fuel
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/048 Activation functions
    • G06N 3/08 Learning methods


Abstract

The application relates to a bubble water preparation method and apparatus, a computer device, and a readable storage medium. The method comprises the following steps: acquiring the current water temperature, the current water pressure, and a pre-stored air pressure set; for each air pressure in the set, inputting the current water temperature, the current water pressure, and that air pressure into a pre-trained preparation effect prediction model, which outputs a preparation effect value corresponding to the air pressure; and determining the air pressure corresponding to the minimum preparation effect value as the target air pressure and preparing bubble water according to the current water temperature, the current water pressure, and the target air pressure. The method and apparatus improve the preparation effect of bubble water.

Description

Bubble water preparation method and device, computer equipment and readable storage medium
Technical Field
The application relates to the technical field of computers, in particular to a preparation method and device of bubble water, computer equipment and a readable storage medium.
Background
At present, bubble water is used in gas water heaters to give users a better bathing experience. Bubble water is prepared by thoroughly mixing water and air, and the quality of the result depends on whether the water temperature, water pressure, and air pressure are properly matched. When they are, the preparation effect is good: the bubbles are small in diameter and large in number. When they are not, the preparation effect is poor: the bubbles are large in diameter and few in number. A preparation method that improves the bubble water preparation effect is therefore needed.
Disclosure of Invention
In view of the above, it is necessary to provide a method and an apparatus for preparing bubble water, a computer device and a readable storage medium.
In a first aspect, there is provided a method for preparing bubble water, the method comprising:
acquiring a current water temperature, a current water pressure and a pre-stored air pressure set;
for each air pressure in the air pressure set, inputting the current water temperature, the current water pressure, and that air pressure into a pre-trained preparation effect prediction model, and outputting a preparation effect value corresponding to the air pressure;
and determining the air pressure corresponding to the minimum preparation effect value as a target air pressure, and preparing bubble water according to the current water temperature, the current water pressure and the target air pressure.
As an optional implementation, the method further comprises:
acquiring a pre-stored training sample set, wherein the training sample set comprises training samples and training preparation effect values corresponding to the training samples;
for each training sample in the training sample set, inputting the training sample into a preparation effect prediction model to be trained, and outputting a first prediction preparation effect value corresponding to the training sample;
if the average value of the first predicted preparation effect values is smaller than or equal to a preset target predicted preparation effect value, judging that the preparation effect prediction model to be trained is trained;
and if the average value of the first predicted preparation effect values is greater than the preset target predicted preparation effect value, adjusting the model parameters of the preparation effect prediction model to be trained according to the training samples, their corresponding training preparation effect values, their corresponding first predicted preparation effect values, a preset model learning rate, and a preset model parameter adjustment algorithm, and then returning to the step of inputting each training sample in the training sample set into the preparation effect prediction model to be trained and outputting its first predicted preparation effect value.
As an optional implementation manner, the adjusting, according to the training sample, the training preparation effect value corresponding to the training sample, the first prediction preparation effect value corresponding to the training sample, the preset model learning rate, and the preset model parameter adjustment algorithm, the model parameter of the preparation effect prediction model to be trained includes:
determining a training deviation value according to the average value of the training preparation effect values corresponding to the training samples and the average value of the first prediction preparation effect values corresponding to the training samples;
determining a partial derivative corresponding to a model parameter according to the training sample, the average value of the training preparation effect values corresponding to the training sample, the average value of the first prediction preparation effect values corresponding to the training sample and the training deviation value;
and updating the model parameters of the preparation effect prediction model to be trained according to the model parameters, the partial derivatives corresponding to the model parameters and the model learning rate.
As an optional implementation, the method further comprises:
acquiring a pre-stored verification sample set, wherein the verification sample set comprises verification samples and verification preparation effect values corresponding to the verification samples;
for each verification sample in the verification sample set, inputting the verification sample into a preparation effect prediction model to be verified, and outputting a second prediction preparation effect value corresponding to the verification sample;
if the absolute difference value of the second predicted preparation effect value corresponding to the verification sample and the verification preparation effect value corresponding to the verification sample is smaller than or equal to a preset difference threshold value, determining that the verification sample is a target verification sample;
and if the ratio of the number of the target verification samples to the total number of the verification samples is greater than or equal to a preset ratio threshold, judging that the preparation effect prediction model to be verified meets the training requirement.
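The verification rule above reduces to a simple counting check. A minimal sketch in Python; the function and parameter names are illustrative, not taken from the patent:

```python
def model_passes_verification(predicted, actual, diff_threshold, ratio_threshold):
    """Return True if the model meets the training requirement.

    A verification sample counts as a target verification sample when the
    absolute difference between its predicted and actual preparation effect
    values is at most diff_threshold; the model passes when the fraction of
    target samples reaches ratio_threshold.
    """
    targets = sum(1 for p, a in zip(predicted, actual)
                  if abs(p - a) <= diff_threshold)
    return targets / len(actual) >= ratio_threshold
```

For instance, with three verification samples of which two fall within the difference threshold, the model passes a 0.6 ratio threshold but fails a 0.9 one.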
In a second aspect, there is provided an apparatus for preparing bubble water, the apparatus comprising:
the first acquisition module is used for acquiring the current water temperature, the current water pressure and a pre-stored air pressure set;
a first output module, configured to input, for each air pressure in the air pressure set, the current water temperature, the current water pressure, and the air pressure to a pre-trained preparation effect prediction model, and output a preparation effect value corresponding to the air pressure;
and the first determining module is used for determining the air pressure corresponding to the minimum preparation effect value as a target air pressure and preparing bubble water according to the current water temperature, the current water pressure and the target air pressure.
As an optional implementation, the apparatus further comprises:
the second acquisition module is used for acquiring a pre-stored training sample set, wherein the training sample set comprises training samples and training preparation effect values corresponding to the training samples;
the second output module is used for inputting, for each training sample in the training sample set, the training sample into a preparation effect prediction model to be trained, and outputting a first predicted preparation effect value corresponding to the training sample;
the first judgment module is used for judging that the training of the preparation effect prediction model to be trained is finished if the average value of the first prediction preparation effect values is less than or equal to a preset target prediction preparation effect value;
and a second determining module, configured to, if the average value of the first predicted preparation effect values is greater than a preset target predicted preparation effect value, adjust the model parameters of the preparation effect prediction model to be trained according to the training samples, their corresponding training preparation effect values, their corresponding first predicted preparation effect values, a preset model learning rate, and a preset model parameter adjustment algorithm, and then trigger the second output module to repeat the step of inputting each training sample into the preparation effect prediction model to be trained and outputting its first predicted preparation effect value.
As an optional implementation manner, the second determining module is specifically configured to:
determining a training deviation value according to the average value of the training preparation effect values corresponding to the training samples and the average value of the first prediction preparation effect values corresponding to the training samples;
determining a partial derivative corresponding to a model parameter according to the training sample, the average value of the training preparation effect values corresponding to the training sample, the average value of the first prediction preparation effect values corresponding to the training sample and the training deviation value;
and updating the model parameters of the preparation effect prediction model to be trained according to the model parameters, the partial derivatives corresponding to the model parameters and the model learning rate.
As an optional implementation, the apparatus further comprises:
the third acquisition module is used for acquiring a pre-stored verification sample set, wherein the verification sample set comprises verification samples and verification preparation effect values corresponding to the verification samples;
the third output module is used for inputting each verification sample in the verification sample set to a preparation effect prediction model to be verified and outputting a second prediction preparation effect value corresponding to the verification sample;
a third determining module, configured to determine that the verification sample is a target verification sample if an absolute difference between a second predicted preparation effect value corresponding to the verification sample and a verification preparation effect value corresponding to the verification sample is less than or equal to a preset difference threshold;
and the second judgment module is used for judging that the preparation effect prediction model to be verified meets the training requirement if the ratio of the number of the target verification samples to the total number of the verification samples is greater than or equal to a preset ratio threshold.
In a third aspect, a computer device is provided, comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor implements the method steps of the first aspect when executing the program.
In a fourth aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when being executed by a processor, carries out the method steps of the first aspect.
The application provides a preparation method and device of bubble water, computer equipment and a readable storage medium. The gas water heater obtains the current water temperature, the current water pressure and a prestored air pressure set. Then, aiming at each air pressure in the air pressure set, the gas water heater inputs the current water temperature, the current water pressure and the air pressure into a preparation effect prediction model trained in advance, and outputs a preparation effect value corresponding to the air pressure. And then, the gas water heater determines the air pressure corresponding to the minimum preparation effect value as a target air pressure, and prepares bubble water according to the current water temperature, the current water pressure and the target air pressure. Thus, the effect of preparing bubble water can be improved.
Drawings
FIG. 1 is a schematic structural diagram of a bubble water generator according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method for preparing bubble water according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a preparation effect prediction model provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of a sigmoid activation function according to an embodiment of the present application;
FIG. 5 is a flowchart of a training method for preparing an effect prediction model according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a training sample set provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of model parameters provided in an embodiment of the present application;
fig. 8 is a flowchart of a verification method for preparing an effect prediction model according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a verification sample set provided by an embodiment of the present application;
FIG. 10 is a schematic structural diagram of an apparatus for preparing bubble water according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the application and are not intended to limit it.
Fig. 1 is a schematic structural diagram of a bubble water generator according to an embodiment of the present disclosure. As shown in fig. 1, the water inlet of the bubble water generator is connected with the water inlet pipe, the air inlet is connected with the air inlet pipe, and the air inlet pipe is connected with a compression motor (not shown in fig. 1); the electric control board of the gas water heater controls the rotating speed of the compression motor and thereby the air pressure. The electric control board also controls a rotating motor (not shown in fig. 1) that drives an impeller in the bubble water generator, thoroughly mixing water and air to prepare bubble water. The quality of the preparation effect depends on whether the water temperature, water pressure, and air pressure are properly matched: when they are, the bubbles are small in diameter and large in number; when they are not, the bubbles are large in diameter and few in number.
The bubble water preparation method provided in the embodiments of the present application is explained in detail below with reference to specific embodiments. As shown in fig. 2, the specific steps are as follows:
step 201, acquiring a current water temperature, a current water pressure and a pre-stored air pressure set.
In implementation, the bubble water generator is arranged inside the gas water heater, and its water inlet is connected with the water inlet pipe of the gas water heater. Therefore, when bubble water needs to be prepared, the gas water heater can acquire the current water temperature through the inlet water temperature sensor, acquire the current water pressure through the inlet water flow sensor, and read the pre-stored air pressure set. The air pressure set contains a plurality of air pressures, which can be set by a technician according to the output air pressure of the compression motor. For example, the air pressure set may be {0.2 MPa, 0.3 MPa, 0.4 MPa, 0.5 MPa, 0.6 MPa, 0.7 MPa, 0.8 MPa}.
Step 202, inputting the current water temperature, the current water pressure and the air pressure into a pre-trained preparation effect prediction model for each air pressure in the air pressure set, and outputting a preparation effect value corresponding to the air pressure.
In implementation, fig. 3 is a schematic diagram of the preparation effect prediction model provided in an embodiment of the present application. As shown in fig. 3, the preparation effect prediction model comprises an input layer, a hidden layer, and an output layer. The input layer has 3 input parameters (x1, x2, and x3) and 6 weight coefficients (ω11, ω12, ω21, ω22, ω31, and ω32), where x1 represents the water pressure, x2 the water temperature, and x3 the air pressure; ω11 and ω12 correspond to x1, ω21 and ω22 to x2, and ω31 and ω32 to x3. The hidden layer has 2 intermediate parameters (a1 and a2), 2 thresholds (b1 and b2), and 1 activation function. The sigmoid function may be chosen as the activation function; other activation functions may be used as well, and this embodiment does not limit the choice. Fig. 4 is a schematic diagram of the sigmoid activation function; as shown in fig. 4, its output lies in the interval (0, 1). The output layer has 2 intermediate parameters (z1 and z2), 2 weight coefficients (ω1 and ω2), 1 threshold (b), and 1 output parameter (y), where ω1 corresponds to z1, ω2 corresponds to z2, and y represents the preparation effect value. Given this structure, the model parameters are related by formulas (1) to (5) below; the sigmoid activation function is given by formula (6), and its derivative by formula (7).
Formula (1): a1 = ω11·x1 + ω21·x2 + ω31·x3 + b1;
Formula (2): a2 = ω12·x1 + ω22·x2 + ω32·x3 + b2;
Formula (3): z1 = φ(a1);
Formula (4): z2 = φ(a2);
Formula (5): y = ω1·z1 + ω2·z2 + b;
Formula (6): φ(x) = 1 / (1 + e^(-x));
Formula (7): φ′(x) = φ(x)·(1 − φ(x)).
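Formulas (1) through (6) describe a standard 3-2-1 feedforward pass, which can be sketched directly in Python. The parameter names such as w11 are transliterations of ω11, and the dictionary layout is an assumption for illustration, not part of the patent:

```python
import math

def sigmoid(x):
    # Formula (6): phi(x) = 1 / (1 + e^(-x)); the output lies in (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def predict_effect(x1, x2, x3, p):
    """Forward pass of the 3-2-1 network in formulas (1)-(5).
    x1: water pressure, x2: water temperature, x3: air pressure;
    p maps parameter names (w11 stands for omega-11, and so on) to values."""
    a1 = p["w11"] * x1 + p["w21"] * x2 + p["w31"] * x3 + p["b1"]  # formula (1)
    a2 = p["w12"] * x1 + p["w22"] * x2 + p["w32"] * x3 + p["b2"]  # formula (2)
    z1 = sigmoid(a1)                                              # formula (3)
    z2 = sigmoid(a2)                                              # formula (4)
    return p["w1"] * z1 + p["w2"] * z2 + p["b"]                   # formula (5)
```

With all parameters zero, both hidden activations are sigmoid(0) = 0.5 and the output is 0, which is a quick sanity check on the wiring.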
After the gas water heater acquires the current water temperature, the current water pressure, and the pre-stored air pressure set, it can input the current water temperature, the current water pressure, and each air pressure in the set into the pre-trained preparation effect prediction model, which outputs the preparation effect value corresponding to that air pressure. For example, if the current water pressure is 0.2 MPa, the current water temperature is 21 °C, and the air pressure set is {0.2 MPa, 0.3 MPa, 0.4 MPa, 0.5 MPa, 0.6 MPa, 0.7 MPa, 0.8 MPa}, the pre-trained preparation effect prediction model outputs preparation effect values of 2.8, 2.6, 1.9, 1.8, 1.5, 0.7, and 0.9, respectively.
And 203, determining the air pressure corresponding to the minimum preparation effect value as a target air pressure, and preparing bubble water according to the current water temperature, the current water pressure and the target air pressure.
In practice, a smaller preparation effect value in the present application indicates a better preparation effect. Therefore, after the gas water heater obtains the preparation effect values corresponding to the air pressures, it can determine the air pressure corresponding to the minimum preparation effect value as the target air pressure, then control the compression motor to output the target air pressure and prepare bubble water according to the current water temperature, the current water pressure, and the target air pressure. For example, if the air pressure set is {0.2 MPa, 0.3 MPa, 0.4 MPa, 0.5 MPa, 0.6 MPa, 0.7 MPa, 0.8 MPa} and the model outputs preparation effect values of 2.8, 2.6, 1.9, 1.8, 1.5, 0.7, and 0.9, respectively, the minimum preparation effect value is 0.7 and the target air pressure is 0.7 MPa.
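Steps 201 to 203 amount to a small selection routine. A sketch in Python; the names are illustrative, and `model` stands for the pre-trained preparation effect prediction model, called as model(water_pressure, water_temperature, air_pressure):

```python
def select_target_air_pressure(water_temp, water_pressure, air_pressure_set, model):
    # Step 202: predict a preparation effect value for each candidate air pressure.
    effects = {ap: model(water_pressure, water_temp, ap) for ap in air_pressure_set}
    # Step 203: a smaller value means a better effect, so pick the minimum.
    return min(effects, key=effects.get)
```

With the example values from the text (effect values 2.8, 2.6, 1.9, 1.8, 1.5, 0.7, and 0.9 for pressures 0.2 MPa to 0.8 MPa), the routine returns 0.7 MPa.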
The embodiment of the application further provides a training method for preparing the effect prediction model, as shown in fig. 5, the specific processing procedures are as follows:
step 501, a pre-stored training sample set is obtained. The training sample set comprises training samples and training preparation effect values corresponding to the training samples.
In implementation, when an initial production effectiveness prediction model (i.e., a production effectiveness prediction model to be trained) needs to be trained, the computer device may obtain a pre-stored set of training samples. The training sample set comprises training samples and training preparation effect values corresponding to the training samples, and the training samples comprise sample water pressure, sample water temperature and sample air pressure. Fig. 6 is a schematic diagram of a training sample set according to an embodiment of the present application. As shown in fig. 6, the training sample set includes 10 training samples (i.e., sample 1 to sample 10 in fig. 6) and a training preparation effect value (i.e., Y in fig. 6) corresponding to each training sample, where each training sample includes a training water pressure (i.e., x1 in fig. 6), a training water temperature (i.e., x2 in fig. 6), and a training air pressure (i.e., x3 in fig. 6).
Step 502, for each training sample in the training sample set, inputting the training sample into the preparation effect prediction model to be trained and outputting a first predicted preparation effect value corresponding to the training sample.
In implementation, after the computer device obtains the training sample set, for each training sample in the training sample set, the training sample may be input to the preparation effect prediction model to be trained. Correspondingly, the preparation effect prediction model to be trained outputs a first preparation effect prediction value corresponding to the training sample.
Step 503, if the average value of the first predicted preparation effect values is less than or equal to the preset target predicted preparation effect value, it is determined that the training of the preparation effect prediction model to be trained is completed.
In implementation, if the average value of the first predicted preparation effect values corresponding to the training samples is less than or equal to the preset target predicted preparation effect value, the preparation effect prediction model to be trained already meets the training requirement, and the computer device may determine that its training is complete. The target predicted preparation effect value may be set by a technician according to the actual situation; this embodiment does not limit it. For example, it may be set to 0.00000044.
Step 504, if the average value of the first predicted preparation effect values is greater than the preset target predicted preparation effect value, adjusting the model parameters of the preparation effect prediction model to be trained according to the training samples, their corresponding training preparation effect values, their corresponding first predicted preparation effect values, the preset model learning rate, and the preset model parameter adjustment algorithm, and then returning to the step of inputting each training sample in the training sample set into the model and outputting its first predicted preparation effect value.
In implementation, if the average value of the first predicted preparation effect values corresponding to the training samples is greater than the preset target predicted preparation effect value, it is indicated that the preparation effect prediction model to be trained does not meet the requirements. Correspondingly, the computer device may adjust the model parameters of the preparation effect prediction model to be trained according to the training sample, the training preparation effect value corresponding to the training sample, the first prediction preparation effect value corresponding to the training sample, the preset model learning rate, and the preset model parameter adjustment algorithm. Thereafter, the computer device may repeatedly execute step 502 to train the preparation effect prediction model to be trained until the preparation effect prediction model to be trained meets the training requirement.
For example, with the training sample set shown in fig. 6, a model learning rate of 0.001, and a target predicted preparation effect value of 0.00000044: after 2752 training iterations, the average of the first predicted preparation effect values was 4.3998734 × 10⁻⁷, which is less than the target predicted preparation effect value, so the computer device may determine that the training of the preparation effect prediction model is complete. Fig. 7 is a schematic diagram of the model parameters provided in the embodiment of the present application; the parameters of the trained preparation effect prediction model are shown in fig. 7.
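Steps 501 to 504 form a loop that can be sketched as follows. Here `predict` stands for the forward pass and `adjust` for the parameter adjustment of the later steps; both are passed in as callables, and all names are illustrative. The stopping test is written as the patent states it, comparing the average of the first predicted preparation effect values against the target value:

```python
def train_until_target(params, samples, targets, predict, adjust,
                       target_value, lr, max_epochs=100000):
    """Train the preparation effect prediction model (steps 501-504).

    params: mutable dict of model parameters; samples: training samples;
    targets: training preparation effect values; lr: model learning rate.
    Returns the iteration at which training completed."""
    for epoch in range(1, max_epochs + 1):
        # Step 502: forward pass over every training sample.
        preds = [predict(s, params) for s in samples]
        # Step 503: stop when the average predicted value reaches the target.
        if sum(preds) / len(preds) <= target_value:
            return epoch
        # Step 504: adjust parameters and iterate again.
        adjust(params, samples, targets, preds, lr)
    return max_epochs
```

A toy model whose prediction shrinks by lr per adjustment converges in a handful of iterations, mirroring the 2752-iteration run described above at a much smaller scale.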
Optionally, the specific procedure by which the computer device adjusts the model parameters of the preparation effect prediction model to be trained according to the training sample, the training preparation effect value corresponding to the training sample, the first predicted preparation effect value corresponding to the training sample, the preset model learning rate, and the preset model parameter adjustment algorithm is as follows:
Step one, determining a training deviation value according to the average value of the training preparation effect values corresponding to the training samples and the average value of the first predicted preparation effect values corresponding to the training samples.
In implementation, after the computer device obtains the first predicted preparation effect value corresponding to each training sample, the computer device may further determine a training deviation value according to an average value of the training preparation effect values corresponding to each training sample and an average value of the first predicted preparation effect values corresponding to each training sample. Wherein, the formula for determining the training deviation value is formula (8).
Formula (8): E = (y_avg − Y_avg)² / 2;
where E represents the training deviation value, y_avg the average of the first predicted preparation effect values, and Y_avg the average of the training preparation effect values.
And step two, determining the partial derivatives corresponding to the model parameters according to the training samples, the average value of the training preparation effect values corresponding to the training samples, the average value of the first predicted preparation effect values corresponding to the training samples, and the training deviation value.
In implementation, after the computer device determines the training deviation value, the partial derivatives corresponding to the model parameters may be further determined according to the training samples, the average value of the training preparation effect values corresponding to the training samples, the average value of the first predicted preparation effect values corresponding to the training samples, and the training deviation value. The partial derivatives of the respective model parameters, calculated based on the above-described formulas (1) to (8), are as follows.
The partial derivatives are determined for each of the model parameters b, b1, b2, ω1, ω2, ω11, ω12, ω21, ω22, ω31 and ω32 (the corresponding equation images are not reproduced in this text).
It should be noted that the computer device may select the training water pressure (x1), the training water temperature (x2) and the training air pressure (x3) contained in any training sample to determine the partial derivatives of the model parameters.
And step three, updating the model parameters of the preparation effect prediction model to be trained according to the model parameters, the partial derivatives corresponding to the model parameters and the model learning rate.
In implementation, after the computer device determines the partial derivatives corresponding to the model parameters, the model parameters of the preparation effect prediction model to be trained may be further updated according to the model parameters, the partial derivatives corresponding to the model parameters, and the model learning rate. Wherein the computer device updates the model parameters of the preparation effect prediction model to be trained based on the formula (9) and the formula (10).
Formula (9): ω_new = ω_old − a · ∂E/∂ω

Formula (10): b_new = b_old − a · ∂E/∂b

where ω_new and b_new are the updated model parameters, ω_old and b_old are the current model parameters, and a is the model learning rate.
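The parameter update can be sketched as plain gradient descent. The concrete network shape below (three inputs, two linear hidden units, one linear output, written with `w` for ω) and the resulting partial derivatives are assumptions for illustration, since the original equation images for formulas (1) to (8) are not reproduced in this text.

```python
class EffectModel:
    """Sketch of an assumed 3-2-1 network: y = w1*h1 + w2*h2 + b, with
    h1 = w11*x1 + w21*x2 + w31*x3 + b1 and h2 = w12*x1 + w22*x2 + w32*x3 + b2."""

    def __init__(self):
        # small non-zero starting parameters (illustrative values only)
        self.w1 = self.w2 = 0.1
        self.w11 = self.w12 = self.w21 = self.w22 = self.w31 = self.w32 = 0.1
        self.b = self.b1 = self.b2 = 0.0

    def hidden(self, x1, x2, x3):
        h1 = self.w11 * x1 + self.w21 * x2 + self.w31 * x3 + self.b1
        h2 = self.w12 * x1 + self.w22 * x2 + self.w32 * x3 + self.b2
        return h1, h2

    def predict(self, x1, x2, x3):
        h1, h2 = self.hidden(x1, x2, x3)
        return self.w1 * h1 + self.w2 * h2 + self.b

    def update(self, x1, x2, x3, y_avg, Y_avg, a):
        """One step of formulas (9)/(10): theta_new = theta_old - a * dE/dtheta,
        assuming the deviation E = 0.5 * (y_avg - Y_avg)**2."""
        e = y_avg - Y_avg                   # dE/dy for the assumed E
        h1, h2 = self.hidden(x1, x2, x3)
        w1, w2 = self.w1, self.w2           # keep current values for the chain rule
        self.w1 -= a * e * h1               # dE/dw1 = e * h1
        self.w2 -= a * e * h2               # dE/dw2 = e * h2
        self.b -= a * e                     # dE/db  = e
        self.w11 -= a * e * w1 * x1         # hidden-layer weights, neuron 1
        self.w21 -= a * e * w1 * x2
        self.w31 -= a * e * w1 * x3
        self.w12 -= a * e * w2 * x1         # hidden-layer weights, neuron 2
        self.w22 -= a * e * w2 * x2
        self.w32 -= a * e * w2 * x3
        self.b1 -= a * e * w1
        self.b2 -= a * e * w2
```

Linear activations are used here purely to keep the derivatives readable; repeated calls to `update` with a small learning rate drive the predicted value toward the training value.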
The embodiment of the application further provides a verification method for the preparation effect prediction model; as shown in fig. 8, the specific processing procedure is as follows:
step 801, a pre-stored verification sample set is obtained. The verification sample set comprises a verification sample and a verification preparation effect value corresponding to the verification sample.
In implementation, the computer device may obtain a pre-stored verification sample set when the preparation effect prediction model to be verified needs to be verified. The verification sample set comprises verification samples and verification preparation effect values corresponding to the verification samples, and each verification sample comprises a verification water pressure, a verification water temperature and a verification air pressure. Fig. 9 is a schematic diagram of a verification sample set according to an embodiment of the present application. As shown in fig. 9, the verification sample set includes 10 verification samples (i.e., sample 1 to sample 10 in fig. 9) and a verification preparation effect value (i.e., y in fig. 9) corresponding to each verification sample, where each verification sample includes a verification water pressure (i.e., x1 in fig. 9), a verification water temperature (i.e., x2 in fig. 9), and a verification air pressure (i.e., x3 in fig. 9).
Step 802, for each verification sample in the verification sample set, inputting the verification sample to a preparation effect prediction model to be verified, and outputting a second predicted preparation effect value corresponding to the verification sample.
In implementation, after the computer device acquires the verification sample set, for each verification sample in the verification sample set, the verification sample may be input to the preparation effect prediction model to be verified. Correspondingly, the preparation effect prediction model to be verified outputs a second predicted preparation effect value corresponding to the verification sample.
In step 803, if the absolute difference between the second predicted preparation effect value corresponding to the verification sample and the verification preparation effect value corresponding to the verification sample is less than or equal to the preset difference threshold, it is determined that the verification sample is the target verification sample.
In implementation, after obtaining the second predicted preparation effect value corresponding to the verification sample, the computer device may further determine whether the absolute difference between the second predicted preparation effect value corresponding to the verification sample and the verification preparation effect value corresponding to the verification sample is less than or equal to a preset difference threshold. If so, the computer device may determine the verification sample to be a target verification sample. The difference threshold may be set by a technician according to the actual situation, which is not limited in the embodiments of the present application.
Step 804, if the ratio of the number of the target verification samples to the total number of the verification samples is greater than or equal to a preset ratio threshold, determining that the preparation effect prediction model to be verified meets the training requirement.
In implementation, after the computer device determines the target verification samples, it may further determine whether the ratio of the number of target verification samples to the total number of verification samples is greater than or equal to a preset ratio threshold. If so, the computer device may determine that the preparation effect prediction model to be verified meets the training requirement. The ratio threshold may be set by a technician according to the actual situation, which is not limited in the embodiments of the present application.
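Steps 801 to 804 can be sketched as follows; the function names and the example thresholds in the usage below are illustrative rather than taken from the original.

```python
def model_meets_requirement(predict, verification_set,
                            difference_threshold, ratio_threshold):
    """verification_set: list of (sample, verification_effect_value) pairs.
    A sample is a target verification sample when the absolute difference
    between its second predicted value and its verification value is within
    difference_threshold; the model meets the training requirement when the
    share of target samples reaches ratio_threshold."""
    target_count = 0
    for sample, verification_value in verification_set:
        second_predicted = predict(sample)                       # step 802
        if abs(second_predicted - verification_value) <= difference_threshold:
            target_count += 1                                    # step 803
    ratio = target_count / len(verification_set)
    return ratio >= ratio_threshold                              # step 804
```

For example, with a toy model `predict = lambda s: s * 2.0`, a difference threshold of 0.1 and a ratio threshold of 0.6, a verification set in which two of three samples fall within the threshold passes the check.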
The embodiment of the application provides a preparation method of bubble water. The gas water heater obtains the current water temperature, the current water pressure and a prestored air pressure set. Then, aiming at each air pressure in the air pressure set, the gas water heater inputs the current water temperature, the current water pressure and the air pressure into a preparation effect prediction model trained in advance, and outputs a preparation effect value corresponding to the air pressure. And then, the gas water heater determines the air pressure corresponding to the minimum preparation effect value as the target air pressure, and prepares bubble water according to the current water temperature, the current water pressure and the target air pressure. Thus, the effect of producing bubble water can be improved.
An embodiment of the present application further provides an apparatus for preparing bubble water, as shown in fig. 10, the apparatus includes:
a first obtaining module 1010, configured to obtain a current water temperature, a current water pressure, and a pre-stored air pressure set;
a first output module 1020, configured to input, for each air pressure in the air pressure set, the current water temperature, the current water pressure, and the air pressure to a pre-trained preparation effect prediction model, and output a preparation effect value corresponding to the air pressure;
the first determining module 1030 is configured to determine the air pressure corresponding to the minimum preparation effect value as a target air pressure, and prepare bubble water according to the current water temperature, the current water pressure, and the target air pressure.
As an optional implementation, the apparatus further comprises:
the second acquisition module is used for acquiring a pre-stored training sample set, and the training sample set comprises training samples and training preparation effect values corresponding to the training samples;
the second output module is used for inputting the training sample to a preparation effect prediction model to be trained aiming at each training sample in the training sample set and outputting a first prediction preparation effect value corresponding to the training sample;
the first judgment module is used for judging that the training of the preparation effect prediction model to be trained is finished if the average value of the first prediction preparation effect values is less than or equal to a preset target prediction preparation effect value;
and the second determining module is used for adjusting the model parameters of the preparation effect prediction model to be trained according to the training samples, the training preparation effect values corresponding to the training samples, the first prediction preparation effect values corresponding to the training samples, the preset model learning rate and a preset model parameter adjusting algorithm if the average value of the first prediction preparation effect values is greater than the preset target prediction preparation effect value, triggering the second output module to execute the steps of inputting the training samples to the preparation effect prediction model to be trained aiming at each training sample in the training sample set and outputting the first prediction preparation effect values corresponding to the training samples.
As an optional implementation manner, the second determining module is specifically configured to:
determining a training deviation value according to the average value of the training preparation effect values corresponding to the training samples and the average value of the first prediction preparation effect values corresponding to the training samples;
determining a partial derivative corresponding to the model parameter according to the training sample, the average value of the training preparation effect values corresponding to the training sample, the average value of the first prediction preparation effect values corresponding to the training sample and the training deviation value;
and updating the model parameters of the preparation effect prediction model to be trained according to the model parameters, the partial derivatives corresponding to the model parameters and the model learning rate.
As an optional implementation, the apparatus further comprises:
the third acquisition module is used for acquiring a pre-stored verification sample set, and the verification sample set comprises verification samples and verification preparation effect values corresponding to the verification samples;
the third output module is used for inputting the verification sample to the preparation effect prediction model to be verified aiming at each verification sample in the verification sample set and outputting a second prediction preparation effect value corresponding to the verification sample;
a third determining module, configured to determine that the verification sample is a target verification sample if an absolute difference between a second predicted preparation effect value corresponding to the verification sample and a verification preparation effect value corresponding to the verification sample is less than or equal to a preset difference threshold;
and the second judgment module is used for judging that the preparation effect prediction model to be verified meets the training requirement if the ratio of the number of the target verification samples to the total number of the verification samples is greater than or equal to a preset ratio threshold.
In one embodiment, a computer device is provided, as shown in fig. 11, which includes a memory and a processor, where the memory stores a computer program that can be run on the processor, and the processor, when executing the computer program, implements the steps of the above method for preparing bubble water.
In one embodiment, a computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method for preparing bubble water.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by instructing the relevant hardware through a computer program, which can be stored in a non-volatile computer-readable storage medium and which, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
For the sake of brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction between the combinations of the technical features, such combinations should be considered to be within the scope of the present disclosure.
The above-mentioned embodiments only express several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and such variations and modifications fall within the protection scope of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (8)

1. A method of preparing bubble water, the method comprising:
acquiring a current water temperature, a current water pressure and a pre-stored air pressure set;
inputting the current water temperature, the current water pressure and the air pressure into a pre-trained preparation effect prediction model aiming at each air pressure in the air pressure set, and outputting a preparation effect value corresponding to the air pressure;
determining the air pressure corresponding to the minimum preparation effect value as a target air pressure, and preparing bubble water according to the current water temperature, the current water pressure and the target air pressure;
the method further comprises the following steps: acquiring a pre-stored training sample set, wherein the training sample set comprises training samples and training preparation effect values corresponding to the training samples; for each training sample in the training sample set, inputting the training sample into a preparation effect prediction model to be trained, and outputting a first prediction preparation effect value corresponding to the training sample; if the average value of the first predicted preparation effect values is smaller than or equal to a preset target predicted preparation effect value, judging that the preparation effect prediction model to be trained is trained; and if the average value of the first predicted preparation effect values is larger than a preset target predicted preparation effect value, adjusting model parameters of the preparation effect prediction model to be trained according to the training sample, the training preparation effect value corresponding to the training sample, the first predicted preparation effect value corresponding to the training sample, a preset model learning rate and a preset model parameter adjusting algorithm, executing the step of inputting the training sample to the preparation effect prediction model to be trained aiming at each training sample in the training sample set, and outputting the first predicted preparation effect value corresponding to the training sample.
2. The method according to claim 1, wherein the adjusting the model parameters of the preparation effect prediction model to be trained according to the training sample, the training preparation effect value corresponding to the training sample, the first prediction preparation effect value corresponding to the training sample, a preset model learning rate, and a preset model parameter adjusting algorithm comprises:
determining a training deviation value according to the average value of the training preparation effect values corresponding to the training samples and the average value of the first prediction preparation effect values corresponding to the training samples;
determining a partial derivative corresponding to a model parameter according to the training sample, the average value of the training preparation effect values corresponding to the training sample, the average value of the first prediction preparation effect values corresponding to the training sample and the training deviation value;
and updating the model parameters of the preparation effect prediction model to be trained according to the model parameters, the partial derivatives corresponding to the model parameters and the model learning rate.
3. The method of claim 1, further comprising:
acquiring a pre-stored verification sample set, wherein the verification sample set comprises verification samples and verification preparation effect values corresponding to the verification samples;
for each verification sample in the verification sample set, inputting the verification sample into a preparation effect prediction model to be verified, and outputting a second prediction preparation effect value corresponding to the verification sample;
if the absolute difference value between the second predicted preparation effect value corresponding to the verification sample and the verification preparation effect value corresponding to the verification sample is smaller than or equal to a preset difference threshold value, determining that the verification sample is a target verification sample;
and if the ratio of the number of the target verification samples to the total number of the verification samples is greater than or equal to a preset ratio threshold, judging that the preparation effect prediction model to be verified meets the training requirement.
4. An apparatus for preparing bubble water, the apparatus comprising:
the first acquisition module is used for acquiring the current water temperature, the current water pressure and a pre-stored air pressure set;
the first output module is used for inputting the current water temperature, the current water pressure and the air pressure into a pre-trained preparation effect prediction model aiming at each air pressure in the air pressure set and outputting a preparation effect value corresponding to the air pressure;
the first determining module is used for determining the air pressure corresponding to the minimum preparation effect value as a target air pressure and preparing bubble water according to the current water temperature, the current water pressure and the target air pressure;
the device further comprises: the second acquisition module is used for acquiring a pre-stored training sample set, wherein the training sample set comprises training samples and training preparation effect values corresponding to the training samples; the second output module is used for inputting the training sample to a preparation effect prediction model to be trained aiming at each training sample in the training sample set and outputting a first prediction preparation effect value corresponding to the training sample; the first judgment module is used for judging that the training of the preparation effect prediction model to be trained is finished if the average value of the first prediction preparation effect values is less than or equal to a preset target prediction preparation effect value; and a second determining module, configured to, if the average value of the first predicted preparation effect values is greater than a preset target predicted preparation effect value, adjust a model parameter of the to-be-trained preparation effect prediction model according to the training sample, the training preparation effect value corresponding to the training sample, the first predicted preparation effect value corresponding to the training sample, a preset model learning rate, and a preset model parameter adjustment algorithm, trigger the second output module to execute the step of inputting the training sample to the to-be-trained preparation effect prediction model for each training sample in the training sample set, and output the first predicted preparation effect value corresponding to the training sample.
5. The apparatus of claim 4, wherein the second determining module is specifically configured to:
determining a training deviation value according to the average value of the training preparation effect values corresponding to the training samples and the average value of the first prediction preparation effect values corresponding to the training samples;
determining a partial derivative corresponding to a model parameter according to the training sample, the average value of the training preparation effect values corresponding to the training sample, the average value of the first prediction preparation effect values corresponding to the training sample and the training deviation value;
and updating the model parameters of the preparation effect prediction model to be trained according to the model parameters, the partial derivatives corresponding to the model parameters and the model learning rate.
6. The apparatus of claim 4, further comprising:
the third acquisition module is used for acquiring a pre-stored verification sample set, wherein the verification sample set comprises verification samples and verification preparation effect values corresponding to the verification samples;
the third output module is used for inputting each verification sample in the verification sample set to a preparation effect prediction model to be verified and outputting a second prediction preparation effect value corresponding to the verification sample;
a third determining module, configured to determine that the verification sample is a target verification sample if an absolute difference between a second predicted preparation effect value corresponding to the verification sample and a verification preparation effect value corresponding to the verification sample is less than or equal to a preset difference threshold;
and the second judgment module is used for judging that the preparation effect prediction model to be verified meets the training requirement if the ratio of the number of the target verification samples to the total number of the verification samples is greater than or equal to a preset ratio threshold.
7. A computer device comprising a memory and a processor, the memory having stored thereon a computer program operable on the processor, wherein the processor, when executing the computer program, performs the steps of the method of any of claims 1 to 3.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 3.
CN202110999825.9A 2021-08-25 2021-08-25 Bubble water preparation method and device, computer equipment and readable storage medium Active CN113686025B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110999825.9A CN113686025B (en) 2021-08-25 2021-08-25 Bubble water preparation method and device, computer equipment and readable storage medium


Publications (2)

Publication Number Publication Date
CN113686025A CN113686025A (en) 2021-11-23
CN113686025B (en) 2022-10-14

Family

ID=78583704


Country Status (1)

Country Link
CN (1) CN113686025B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN208108496U (en) * 2018-04-25 2018-11-16 芜湖美的厨卫电器制造有限公司 Water heater
CN211345857U (en) * 2019-12-18 2020-08-25 海绵城市雨水收集利用技术有限公司 Electric water heater with micro-nano bubble generating device
CN112197440A (en) * 2020-10-10 2021-01-08 华帝股份有限公司 Water heater and control method thereof
CN112762619A (en) * 2021-02-04 2021-05-07 华帝股份有限公司 Foaming water generating device and gas water heater with same
CN113091300A (en) * 2021-04-07 2021-07-09 华帝股份有限公司 Gas water heater with bubble water function and control method thereof




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant