CN113850348A - Automatic generation method based on perception intensity investigation and evaluation model - Google Patents
Automatic generation method based on perception intensity investigation and evaluation model
- Publication number: CN113850348A
- Application number: CN202111269463.4A
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/214 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06F16/25 — Information retrieval of structured data; integrating or interfacing systems involving database management systems
- G06F18/2415 — Classification techniques relating to the classification model, based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- G06N3/045 — Neural networks; combinations of networks
- G06N3/08 — Neural networks; learning methods
Abstract
The invention relates to an automatic generation method based on a perception intensity investigation and evaluation model, comprising the following steps: step S1: acquiring a research database; step S2: periodically accessing the research database by project ID, querying the project research data, and evaluating whether the data quality meets the condition for entering the automatic algorithm generation stage; step S3: when the research data quality meets the condition of step S2, calculating a perception score for each picture to form picture perception intensity data; step S4: training on the picture perception intensity data with a deep learning network model to obtain an automatic evaluation model for the research question. The method automatically generates a perception intensity evaluation algorithm from research data, achieving quantitative evaluation of the research dimension, so that the research reflects the respondents' perceptions more accurately and provides an analytical basis for applying and extending later research results.
Description
[ Technical Field ]
The invention relates to the technical fields of questionnaire surveys and algorithm development, and in particular to an automatic generation method based on a perception intensity investigation and evaluation model.
[ Background of the Invention ]
The market for market research, both domestic and international, continues to grow rapidly. According to a report by ESOMAR (the European Society for Opinion and Marketing Research), the turnover of China's market research industry has maintained long-term growth, reaching 27.0 billion yuan in 2019. Meanwhile, government purchases of public policy research, evaluation, and data services account for roughly 6.54% of total ministerial-level service procurement, reaching about 930 billion yuan in 2019, and the market has great room for future expansion.
In the prior art, questionnaire results are generally entered manually, tallied in an Excel spreadsheet, and analyzed in chart form. Most of this is qualitative analysis: the survey results cannot be evaluated automatically, and no evaluation model is formed that could automatically evaluate new survey objects.
Therefore, a new automatic generation method based on a perception intensity investigation and evaluation model is needed to solve the above technical problems.
[ Summary of the Invention ]
The main object of the invention is to provide an automatic generation method based on a perception intensity investigation and evaluation model that analyzes and evaluates research data automatically, forms an automatic evaluation model for a given research question, and can further be applied to automatically evaluate and score that research question, achieving quantitative evaluation.
The invention achieves this object through the following technical scheme: an automatic generation method based on a perception intensity investigation and evaluation model, comprising the following steps:
step S1: acquiring a research database, where the research data comprises project IDs, research questions, a research picture set, and the number of comparisons for each picture in the research picture set;
step S2: periodically accessing the research database by project ID, querying the project research data, and evaluating whether the data quality meets the condition for entering the automatic algorithm generation stage;
step S3: when the research data quality meets the condition of step S2, calculating the perception intensity score of each picture, which represents the picture's perception intensity on the specific dimension addressed by the research question, to form picture perception intensity data;
step S4: training on the picture perception intensity data with a deep learning network model to obtain an automatic evaluation model for the research question.
Further, the method for forming the research database in step S1 includes:
S11) obtaining the research question for the questionnaire to be generated;
S12) acquiring the research picture set with a web crawler;
S13) generating a research questionnaire from the research question of step S11) and the research picture set of step S12), presenting two pictures at a time for the user to choose between (a two-alternative forced choice), and finally completing data collection to form the research database.
Further, the condition for entering the automatic algorithm generation stage in step S2) is: for the research picture set, count the number of comparisons of each picture against other pictures; the average number of comparisons over all pictures must exceed a set threshold, generally more than 10, e.g. 10, 15, or 20. A sketch of this check follows.
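As a reading aid (not part of the patent), the entry condition reduces to a simple average check; the threshold of 10 follows the example above.

```python
# Illustrative sketch of the step S2) entry condition: the average number of
# comparisons per picture in the research picture set must exceed a threshold.

def meets_entry_condition(comparison_counts, min_average=10):
    """comparison_counts: per-picture comparison counts for one project."""
    if not comparison_counts:
        return False
    average = sum(comparison_counts) / len(comparison_counts)
    return average > min_average

# Example: an average of 12.4 comparisons per picture passes a threshold of 10.
print(meets_entry_condition([8, 15, 12, 14, 13]))  # True
```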
Further, the method for calculating the perception intensity in step S3) includes:
S31) calculating the probability P_i that each picture is selected and the probability N_i that it is not selected:
P_i = p_i / (p_i + n_i + e_i)
N_i = n_i / (p_i + n_i + e_i)
where, for picture i, p_i is the number of times the picture was selected in the pairwise comparisons, n_i is the number of times it was not selected, and e_i is the number of comparisons in which neither picture could be selected;
S32) calculating the perception intensity score Q_i of the picture from the selection probability P_i and the non-selection probability N_i:
Q_i = (10/3) × ( P_i + (1/k_1) Σ_{j=1..k_1} P_j − (1/k_2) Σ_{l=1..k_2} N_l + 1 )
where k_1 is the number of times picture i was selected in the comparisons (the first sum runs over the pictures that i was preferred to) and k_2 is the number of times picture i was not selected (the second sum runs over the pictures preferred to i). Adding the constant 1 and multiplying by the constant 10/3 maps the Q-score into the range 0-10.
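A minimal Python sketch of the S31)-S32) calculation follows, assuming e_i counts comparisons in which neither picture was chosen; the bookkeeping structures (`wins`, `losses`, `ties`, `beat`, `beaten_by`) are illustrative names, not from the patent.

```python
# Minimal sketch of the Q-score calculation in S31)-S32). For each picture i:
# wins[i]/losses[i]/ties[i] are p_i, n_i, e_i; beat[i] lists the pictures that
# i was preferred to; beaten_by[i] lists the pictures preferred to i.

def p_select(i, wins, losses, ties):
    total = wins[i] + losses[i] + ties[i]
    return wins[i] / total if total else 0.0

def n_select(i, wins, losses, ties):
    total = wins[i] + losses[i] + ties[i]
    return losses[i] / total if total else 0.0

def q_score(i, wins, losses, ties, beat, beaten_by):
    k1, k2 = wins[i], losses[i]
    # Average selection probability of the pictures i was preferred to ...
    avg_p = (sum(p_select(j, wins, losses, ties) for j in beat[i]) / k1) if k1 else 0.0
    # ... and average non-selection probability of the pictures preferred to i.
    avg_n = (sum(n_select(j, wins, losses, ties) for j in beaten_by[i]) / k2) if k2 else 0.0
    # Adding 1 and scaling by 10/3 maps the score into the 0-10 range.
    return (10.0 / 3.0) * (p_select(i, wins, losses, ties) + avg_p - avg_n + 1.0)
```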
Further, the step S4) includes the following steps:
S41) dividing the perception intensity scores Q (0-10 points) obtained in step S3) into sections and counting the number of pictures per section;
S42) splitting the pictures and their perception intensity scores within each section into a training data set and a test data set at a set ratio, for example randomly taking 80% as the training data set and 20% as the test data set;
S43) training on the training data set pictures in batches with a DenseNet network; specifically, first build a picture perception intensity classification network, feed the pictures and their corresponding perception intensity scores into network training, and compute the network loss with a mean square error formula; back-propagate through the optimizer, adjust the network, and recompute the loss so that it decreases step by step, saving the model with the minimum loss;
S44) calculating the top-k accuracy of the model;
S45) repeating steps S43)-S44), adjusting the learning rate, loss function, and batch size until the top-k accuracy of step S44) meets a set value, yielding the trained model; for example, top1 > 50% and top2 > 85% for 5-class models, or top1 > 30% and top3 > 80% for 10-class models;
S46) feeding the test data set pictures of step S42) into the model trained in step S45) and outputting predicted perception intensity scores. A training sketch follows below.
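A hedged PyTorch sketch of the S43)-S45) loop: `train_loader`, the learning rate, and the epoch budget are assumptions, and `densenet121` stands in for whichever DenseNet variant is actually used.

```python
# Illustrative training loop for S43)-S45): DenseNet classifier over the
# perception intensity sections, MSE loss between predicted class
# probabilities and one-hot section labels, keeping the minimum-loss model.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

num_sections = 5                                   # e.g. 5 or 10 sections
model = models.densenet121(weights=None)
model.classifier = nn.Linear(model.classifier.in_features, num_sections)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)   # assumed settings
best_loss = float("inf")

for epoch in range(50):                            # assumed epoch budget
    for images, section_ids in train_loader:       # assumed DataLoader of
        optimizer.zero_grad()                      # (picture, section) pairs
        probs = torch.softmax(model(images), dim=1)
        targets = F.one_hot(section_ids, num_sections).float()
        loss = F.mse_loss(probs, targets)          # "mean square error" in S43)
        loss.backward()
        optimizer.step()                           # network adjustment
        if loss.item() < best_loss:                # save the minimum-loss model
            best_loss = loss.item()
            torch.save(model.state_dict(), "best_model.pt")
```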
Further, the step S41) includes:
S411) normalizing the perception intensity score data:
z = (Q - u) / s
where z is the normalized value, u is the mean of all scores, s is the standard deviation of all scores, and Q is the perception intensity score;
S412) uniformly dividing all normalized perception intensity scores, in order of size, into M sections, counting the number of pictures in each section, and computing the mean perception intensity score of each section; for example, taking the integers 1, 2, 3, 4, 5, 6, 7, 8, and 9 as interior break points divides all scores into 10 sections; M can also be 15, 20, 30, etc., set flexibly according to the sample size;
S413) if the number of pictures with scores below 1 and the number with scores above 9 are both greater than 10% of the total, keep the M-section division;
S414) if the number of pictures with scores below 1 or the number with scores above 9 is less than 10% of the total, uniformly re-divide the scores, in order of size, into M/2 sections, re-count the pictures per section, and recompute each section's mean score; for example, splitting all scores into 5 sections at the integers 2, 4, 6, and 8. A sketch of this procedure follows.
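The sketch below illustrates S411)-S414) under two assumptions not fixed by the text: sections are cut at equal width over the normalized range, and the M → M/2 fallback is applied once using the 10% tail rule stated above.

```python
# Illustrative sketch of S411)-S414): normalize the Q-scores, pick M or M/2
# sections based on the 10% tail rule, then bin and compute section means.
import numpy as np

def segment_scores(q, m=10):
    """q: 1-D array of perception intensity scores on the 0-10 scale."""
    z = (q - q.mean()) / q.std()                   # S411) normalization
    # S413)/S414): keep M sections only if both tails (raw score < 1 and
    # raw score > 9) each contain more than 10% of the pictures.
    if (q < 1).mean() <= 0.10 or (q > 9).mean() <= 0.10:
        m //= 2
    # S412): equal-width sections over the normalized range, in size order.
    edges = np.linspace(z.min(), z.max(), m + 1)
    section = np.clip(np.digitize(z, edges[1:-1]), 0, m - 1)
    counts = np.bincount(section, minlength=m)     # pictures per section
    means = np.array([q[section == s].mean() if counts[s] else 0.0
                      for s in range(m)])          # mean score per section
    return section, counts, means
```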
Further, in step S46) each classification probability produced by the DenseNet network is combined, by inner product, with the section means from step S412) or step S414), giving the predicted perception intensity score of each test data set picture. Taking this probability-weighted average as the predicted score accounts for every perception intensity section, effectively avoiding extreme cases and making the predicted score more sound. A one-line sketch follows.
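In code, the S46) scoring rule is a single dot product; the sketch assumes `class_probs` is the network's softmax output and `section_means` the per-section mean scores from S412)/S414).

```python
# The predicted perception intensity is the inner product of the class
# probabilities with the section mean scores (a probability-weighted average).
import numpy as np

def predict_score(class_probs, section_means):
    return float(np.dot(class_probs, section_means))

# Example with 5 sections whose mean scores are 1, 3, 5, 7, 9:
probs = np.array([0.05, 0.10, 0.50, 0.25, 0.10])
means = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
print(predict_score(probs, means))  # 5.5
```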
Compared with the prior art, the automatic generation method based on the perception intensity investigation and evaluation model has the following beneficial effects: it automatically generates a perception intensity evaluation algorithm from research data collected by a two-alternative forced choice between pictures, achieving quantitative evaluation of the research dimension; on one hand it forms a standardized, automated pipeline that simplifies later research data analysis, and on the other hand it builds a quantitative model of perception intensity, so that the research reflects the respondents' perceptions more accurately and provides an analytical basis for applying and extending later research results. Specifically:
1) research is conducted with a two-alternative forced choice to obtain a research database for a specific research question; the comparison counts of the research pictures, combined with the perception intensity calculation for each picture, yield perception intensity data; this data is then used as training input to obtain, via a deep learning network, an automatic evaluation model for the research question, achieving quantitative evaluation of the research dimension;
2) after the research picture set is obtained, a minimum comparison count decides whether a picture enters subsequent model training: pictures with many comparisons are kept and pictures with few comparisons, which have no training value, are deleted, greatly improving the effectiveness of model training and thus the accuracy of the resulting evaluation model;
3) after the perception intensity data is obtained, it is normalized and divided into sections, then grouped into 10 groups by perception intensity score and picture count, eliminating the influence of the score distribution on the evaluation model's accuracy and providing a quantitative reference for scoring subsequent test pictures.
[ Description of the Drawings ]
FIG. 1 is a flow chart of the method of an embodiment of the present invention;
FIG. 2 is a frequency histogram of the comparison count distribution according to an embodiment of the present invention;
FIG. 3 is a frequency histogram of the perception intensity score distribution of an embodiment of the present invention;
FIG. 4 is a frequency histogram of the perception intensity score distribution of the road drain outlet in 2, 3, 5, and 10 sections according to an embodiment of the present invention;
FIG. 5 is a frequency histogram of the perception intensity score distribution of the road guardrail in 2, 3, 5, and 10 sections according to an embodiment of the present invention;
FIG. 6 is a frequency histogram of the perception intensity score distribution of the road manhole cover in 2, 3, 5, and 10 sections according to an embodiment of the present invention;
FIG. 7 is a frequency histogram of the perception intensity score distribution of the road pavement in 2, 3, 5, and 10 sections according to an embodiment of the present invention.
[ Detailed Description ]
The first embodiment is as follows:
referring to fig. 1, the present embodiment is an automatic generation method based on a perceptual strength investigation and evaluation model, which includes the following steps:
step S1: obtain the investigation database, 4 investigation projects have been obtained in this embodiment, and its investigation problem is "which road surface is good? "," which road well is covered? "," which road has a good drain? "," which road guardrail is good? ", the comparison frequency data of the corresponding survey picture set and the corresponding picture is shown in table 1.
TABLE 1
 | Picture set (unit: pictures) | Total comparisons (unit: times) |
---|---|---|
Road pavement | 5117 | 51309 |
Road manhole cover | 1022 | 10328 |
Road drain outlet | 1002 | 8640 |
Road guardrail | 721 | 7233 |
Step S2: and (6) clearing data. The pictures with the number of comparison times less than 10 times are deleted, and processed picture data are obtained, as shown in table 2.
TABLE 2
The comparison counts of the valid pictures are shown as a histogram in Fig. 2, where the abscissa is the number of comparisons and the ordinate is the number of pictures.
Step S3: calculate the perception intensity score of each picture; with the score as abscissa and the number of pictures at each score as ordinate, the histogram of Fig. 3 is generated.
Step S4: normalize all perception intensity scores, divide them evenly into 10 sections between the lowest and highest score, and record the mean of each section. A 10-section division maps the results well onto the (1, 10) score interval: the finer the division, the better it reflects reality, and with enough data a 20-section division would be preferable; this embodiment uses 10 sections because of the limited data volume.
Then, with the perception intensity score as abscissa, a single section as the statistical unit, and the number of pictures in each score section as ordinate, a 10-section classification histogram is generated. To examine the effect of the number of sections, this embodiment also divides the perception intensity scores into 2, 3, and 5 sections as comparative examples and generates the corresponding classification histograms, as shown in Figs. 4-7.
Step S5: split the pictures of step S4), section by section, at a ratio of 8:2 into an 80% training data set and a 20% test data set. The split is shown in Table 3, and a split sketch follows the table.
TABLE 3
 | Valid pictures | Training set | Proportion | Test set | Proportion |
---|---|---|---|---|---|
Road pavement | 5009 | 4259 | 85.03% | 750 | 14.97% |
Road manhole cover | 1001 | 851 | 85.01% | 150 | 14.99% |
Road drain outlet | 955 | 801 | 83.87% | 150 | 15.71% |
Road guardrail | 717 | 567 | 79.08% | 150 | 20.92% |
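A possible per-section 8:2 split is sketched below with scikit-learn; applying it section by section keeps every score range represented in both sets. The function name and seed are illustrative, not from the patent.

```python
# Hypothetical per-section 8:2 split for step S5: each perception intensity
# section is split separately so both sets cover the full score range.
from sklearn.model_selection import train_test_split

def split_by_section(sections, test_ratio=0.2, seed=42):
    """sections: list of (pictures, scores) pairs, one per score section."""
    train, test = [], []
    for pictures, scores in sections:
        p_tr, p_te, s_tr, s_te = train_test_split(
            pictures, scores, test_size=test_ratio, random_state=seed)
        train.append((p_tr, s_tr))
        test.append((p_te, s_te))
    return train, test
```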
Step S6: feed the training data set of step S5) into a DenseNet for training, validate the network with the test data set, and obtain the trained automatic evaluation model for the given research question.
To verify how the number of sections affects algorithm accuracy, this embodiment trains models with the DenseNet network on the data divided into 2, 3, 5, and 10 perception intensity score sections and computes their accuracies. Table 4 shows the top-k accuracy of the road condition scoring algorithm for different section counts, together with its improvement over the random selection probability.
TABLE 4
Table 4 shows that the fewer the groups, the higher the algorithm's accuracy, while the more the groups, the larger its improvement over the random selection probability; the choice of top-k also affects accuracy. Balancing group count and top-k choice against precision, an 8:2 data set split and a 5-section division were adopted.
The accuracy of the final trained automatic evaluation model for each research question is shown in Table 5.
TABLE 5
 | Top-1 accuracy | Top-2 accuracy |
---|---|---|
Road pavement | 68.0% | 81.4% |
Road manhole cover | 64.3% | 94.7% |
Road drain outlet | 56.0% | 91.3% |
Road guardrail | 50.1% | 80.0% |
As can be seen from Table 5:
1) because the road pavement picture set is large and each pair of pictures was compared many times, the automatic evaluation model for the question "Which road surface is better?" reaches 68.0% accuracy in road surface scoring; the accuracy was not pushed further mainly to preserve the generalization of the algorithm, i.e. to ensure that the single algorithm applies to different types of roads.
2) The manhole covers and the road drain outlets have roughly the same number of pictures and comparisons but perform very differently. One main reason may be that manhole covers are easier to compare and clearly show differences in condition, whereas the drain outlet images are often too blurred and too similar to compare, introducing more errors and yielding a relatively low algorithm accuracy.
3) Road guardrail evaluation reaches 50.1% accuracy, mainly because there are fewer research pictures and comparisons, so the model fits the data more easily.
Once the automatic evaluation model for a given research question is obtained, it can be applied directly: given an input picture, the model scores it. For example, the model for "Which road surface is better?" can score pictures of street road elements, as in the sketch below.
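As an illustration, scoring a new picture with the trained model could look like this; the file name, input size, and `section_means` values are placeholders, and `model` is the DenseNet from the earlier training sketch.

```python
# Hypothetical end-to-end scoring of a new street picture with the trained
# automatic evaluation model (names and values are placeholders).
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),     # assumed network input size
    transforms.ToTensor(),
])

model.load_state_dict(torch.load("best_model.pt"))
model.eval()

img = preprocess(Image.open("street_photo.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    probs = torch.softmax(model(img), dim=1).squeeze(0)

section_means = torch.tensor([1.0, 3.0, 5.0, 7.0, 9.0])  # placeholder means
score = float(probs @ section_means)   # inner product, as in step S46)
print(f"Predicted perception intensity score: {score:.2f}")
```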
What has been described above are merely some embodiments of the present invention. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the inventive concept, and such changes and modifications remain within the spirit and scope of the invention.
Claims (8)
1. An automatic generation method based on a perception intensity investigation and evaluation model, characterized by comprising the following steps:
step S1: acquiring a research database, where the research data comprises project IDs, research questions, a research picture set, and the number of comparisons for each picture in the research picture set;
step S2: periodically accessing the research database by project ID, querying the project research data, and evaluating whether the data quality meets the condition for entering the automatic algorithm generation stage;
step S3: when the research data quality meets the condition of step S2), calculating the perception intensity score of each picture to form picture perception intensity data;
step S4: training on the picture perception intensity data with a deep learning network model to obtain an automatic evaluation model for the research question.
2. The method of claim 1, characterized in that the method for forming the research database in step S1) comprises:
S11) obtaining the research question for the questionnaire to be generated;
S12) acquiring the research picture set with a web crawler;
S13) generating a research questionnaire from the research question of step S11) and the research picture set of step S12), presenting two pictures at a time for the user to choose between with a two-alternative forced choice method, and finally completing data collection to form the research database.
3. The method of claim 1, characterized in that the condition for entering the automatic algorithm generation stage in step S2) is: for the research picture set, counting the number of comparisons of each picture against other pictures, and requiring the average number of comparisons over all pictures to be greater than a set number.
4. The method of claim 1, characterized in that the method for calculating the perception intensity in step S3) comprises:
S31) calculating the probability P_i that each picture is selected and the probability N_i that it is not selected:
P_i = p_i / (p_i + n_i + e_i)
N_i = n_i / (p_i + n_i + e_i)
where, for picture i, p_i is the number of times the picture was selected in the pairwise comparisons, n_i is the number of times it was not selected, and e_i is the number of comparisons in which neither picture could be selected;
S32) calculating the perception intensity score Q_i of the picture from the selection probability P_i and the non-selection probability N_i:
Q_i = (10/3) × ( P_i + (1/k_1) Σ_{j=1..k_1} P_j − (1/k_2) Σ_{l=1..k_2} N_l + 1 )
where k_1 is the number of times picture i was selected in the comparisons and k_2 is the number of times picture i was not selected.
5. The method of claim 4, characterized in that step S4) comprises the following steps:
S41) dividing the perception intensity scores obtained in step S3) into sections and counting the number of pictures per section;
S42) splitting the pictures and their perception intensity scores within each section into a training data set and a test data set at a set ratio;
S43) training on the training data set pictures in batches with a DenseNet network;
S44) calculating the top-k accuracy of the model;
S45) repeating steps S43)-S44), adjusting the learning rate, loss function, and batch size until the top-k accuracy of step S44) meets a set value, yielding the trained model;
S46) feeding the test data set pictures of step S42) into the model trained in step S45) and outputting predicted perception intensity scores.
6. The method of claim 5, characterized in that step S43) comprises: first building a picture perception intensity classification network, feeding the pictures and their corresponding perception intensity scores into network training, and computing the network loss with a mean square error formula; then back-propagating through the optimizer, adjusting the network, and recomputing the loss so that it decreases step by step, saving the model with the minimum loss.
7. The method of claim 5, characterized in that step S41) comprises:
S411) normalizing the perception intensity score data:
z = (Q - u) / s
where z is the normalized value, u is the mean of all scores, s is the standard deviation of all scores, and Q is the perception intensity score;
S412) uniformly dividing all normalized perception intensity scores, in order of size, into M sections, counting the number of pictures in each section, and computing the mean perception intensity score of each section;
S413) if the number of pictures with scores below 1 and the number with scores above 9 are both greater than 10% of the total number of pictures, keeping the M-section division;
S414) if the number of pictures with scores below 1 or the number with scores above 9 is less than 10% of the total number of pictures, uniformly re-dividing the scores, in order of size, into M/2 sections, re-counting the pictures per section, and recomputing each section's mean score.
8. The method of claim 7, characterized in that step S46) comprises: taking the inner product of each classification probability computed by the DenseNet network with the section means of step S412) or step S414) to obtain the predicted perception intensity scores of the test data set pictures.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111269463.4A CN113850348A (en) | 2021-10-29 | 2021-10-29 | Automatic generation method based on perception intensity investigation and evaluation model |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111269463.4A CN113850348A (en) | 2021-10-29 | 2021-10-29 | Automatic generation method based on perception intensity investigation and evaluation model |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113850348A true CN113850348A (en) | 2021-12-28 |
Family
ID=78983364
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111269463.4A Pending CN113850348A (en) | 2021-10-29 | 2021-10-29 | Automatic generation method based on perception intensity investigation and evaluation model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113850348A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116629697A (en) * | 2023-06-07 | 2023-08-22 | 河南省科学院地理研究所 | Urban energy ecological evaluation method, system, terminal and storage medium |
CN116629697B (en) * | 2023-06-07 | 2024-03-12 | 河南省科学院地理研究所 | Urban energy ecological evaluation method, system, terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |