CN108647785A - Neural network automatic modeling method, apparatus and storage medium - Google Patents

Neural network automatic modeling method, apparatus and storage medium

Info

Publication number
CN108647785A
Authority
CN
China
Prior art keywords
neural network
network model
text
model
training sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810475514.0A
Other languages
Chinese (zh)
Inventor
田兴邦
杨喆
何国涛
李全忠
蒲瑶
许士亭
穆仕伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Puqiang times (Zhuhai Hengqin) Information Technology Co., Ltd
Original Assignee
Universal Information Technology (Beijing) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universal Information Technology (Beijing) Co., Ltd.
Priority to CN201810475514.0A
Publication of CN108647785A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons, using electronic means
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G06N3/045 Combinations of networks

Abstract

An embodiment of the present invention provides a neural network automatic modeling method, including: obtaining simulation preparation data, and simulating according to the simulation preparation data to generate model training samples; training a neural network model with the model training samples on the basis of network features, and adjusting and configuring the parameters of the neural network model to obtain a usable neural network model; wherein the adjustment and configuration of the parameters of the neural network model are performed automatically. Embodiments of the present invention further provide an active interaction device and a non-transitory readable storage medium for implementing the method. The present invention enables ordinary business personnel to build the desired neural network model efficiently.

Description

Neural network automatic modeling method, apparatus and storage medium
Technical field
Embodiments of the present invention relate to the field of deep learning, and in particular to a neural network automatic modeling method, apparatus, and storage medium.
Background technology
Training a neural network is a complex and difficult process that involves data collection, cleaning, annotation, network training, testing, validation, and deployment. It usually requires repeated rounds of correcting defects, is time-consuming, and does not easily produce good results. Model training is usually carried out by professional big-data engineers. This is unworkable in many scenarios: in many cases data access is controlled, and the data cannot be released to an external platform for professional big-data engineers to work on. Moreover, because of the breakthrough development of artificial intelligence, professional big-data engineers are in short supply, and many enterprises have none at all. In addition, the models needed in many scenarios are non-generic, and as the business changes or new scenarios appear, new models must be built quickly. Therefore, finding a method that allows ordinary business personnel without professional big-data training to model efficiently has become the key issue for spreading neural networks to scenarios in every industry.
Summary of the invention
In view of the above problems in the prior art, embodiments of the present invention provide a neural network automatic modeling method, apparatus, and storage medium.
In one aspect, an embodiment of the present invention provides a neural network automatic modeling method, including: obtaining simulation preparation data, and simulating according to the simulation preparation data to generate model training samples; training a neural network model with the model training samples on the basis of network features; and adjusting and configuring the parameters of the neural network model to obtain a usable neural network model, wherein the adjustment and configuration of the parameters of the neural network model are performed automatically.
In another aspect, embodiments of the present invention provide an active interaction device and a non-transitory readable storage medium. The active interaction device includes at least one processor and at least one memory communicatively connected to the processor, wherein the memory stores program instructions executable by the processor, and the processor invokes the program instructions to perform the neural network automatic modeling method. The non-transitory readable storage medium stores program instructions for performing the neural network automatic modeling method.
Embodiments of the present invention provide a neural network automatic modeling method, apparatus, and storage medium that build neural network models automatically through an autonomously designed automatic modeling workflow. This effectively avoids dependence on big-data professionals, so that ordinary business personnel can also build the desired neural network models efficiently.
Description of the drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention or in the prior art, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is an overall flowchart of the neural network automatic modeling method in the first embodiment of the present invention;
Fig. 2 is an overall flowchart of the neural network automatic modeling method in the second embodiment of the present invention;
Fig. 3 is an overall flowchart of the neural network automatic modeling method in the third embodiment of the present invention;
Fig. 4 is an overall flowchart of the neural network automatic modeling method in the fourth embodiment of the present invention;
Fig. 5 is a diagram of the recall of a neural network model built with the neural network automatic modeling method in an embodiment of the present invention;
Fig. 6 is a diagram of the recognition accuracy of a neural network model built with the neural network automatic modeling method in an embodiment of the present invention;
Fig. 7 is a schematic diagram of the operation of the hardware device in an embodiment of the present invention.
Detailed description of the embodiments
In order to make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
Embodiments of the present invention provide a neural network automatic modeling method, apparatus, and storage medium. Referring to Fig. 1, the overall flowchart of the neural network automatic modeling method in the first embodiment of the present invention, the method includes:
S101: Obtain simulation preparation data, and simulate according to the simulation preparation data to generate model training samples.
S102: Train a neural network model with the model training samples on the basis of network features, and adjust and configure the parameters of the neural network model to obtain a usable neural network model. The adjustment and configuration of the parameters of the neural network model are performed automatically.
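The patent does not name a training framework or a parameter-search strategy for S101 and S102. The following Python snippet is a minimal illustrative sketch only: "simulation" is approximated by noisy replication of the seed data, and a grid search stands in for the automatic adjustment and configuration of the model parameters. The function names, augmentation scheme, and hyperparameter ranges are assumptions, not part of the disclosure.

```python
# Illustrative sketch of S101-S102 (assumptions: noisy replication as the
# "simulation" step, grid search as the automatic parameter configuration).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import GridSearchCV

def simulate_training_samples(seed_features, seed_labels, copies=3, noise=0.01):
    """S101: expand the simulation preparation data into model training samples."""
    x = np.repeat(seed_features, copies, axis=0)
    y = np.repeat(seed_labels, copies)
    return x + noise * np.random.randn(*x.shape), y

def auto_model(features, labels):
    """S102: train a neural network and configure its parameters automatically."""
    grid = {"hidden_layer_sizes": [(64,), (128,)], "alpha": [1e-4, 1e-3]}
    search = GridSearchCV(MLPClassifier(max_iter=500), grid, cv=3)
    search.fit(features, labels)          # parameter adjustment happens here
    return search.best_estimator_         # the usable neural network model
```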
Referring to Fig. 2, the overall flowchart of the neural network automatic modeling method in the second embodiment of the present invention, the method includes:
S201: Obtain a text set, mark the characteristic sentences in the text set, and then extract the neutral texts in the text set. The characteristic sentences include positive characteristics and/or negative characteristics, and the neutral texts are the texts remaining in the text set after the texts in which characteristic sentences have been marked are removed. In another embodiment, obtaining the text set includes: determining a search category (in other embodiments, the search category is described by a short sentence), determining keywords according to the search category, and determining the text set according to the keywords. In yet another embodiment, obtaining the text set includes directly selecting a text set.
S202: Simulate with the texts whose characteristic sentences have been extracted and with the neutral texts to generate model training samples.
S203: Train a neural network model with the model training samples on the basis of network features, and adjust and configure the parameters of the neural network model to obtain a usable neural network model. The adjustment and configuration of the parameters of the neural network model are performed automatically.
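S201 and S202 leave the exact marking and sample-generation rules to the user. As a hedged illustration, the sketch below assumes a characteristic sentence is simply one containing a user-supplied keyword; everything else in the text set becomes neutral text, and the marked and neutral texts are combined into labelled training samples. The keyword-matching rule and the label names are assumptions.

```python
# Hedged illustration of S201-S202: keyword matching stands in for the manual
# marking of characteristic sentences, which the patent leaves to the user.
def split_text_set(texts, positive_keywords, negative_keywords):
    """S201: mark texts containing characteristic sentences; the rest are neutral."""
    characteristic, neutral = [], []
    for text in texts:
        if any(k in text for k in positive_keywords):
            characteristic.append((text, "positive"))
        elif any(k in text for k in negative_keywords):
            characteristic.append((text, "negative"))
        else:
            neutral.append(text)
    return characteristic, neutral

def build_training_samples(characteristic, neutral):
    """S202: combine marked texts and neutral texts into model training samples."""
    return characteristic + [(text, "neutral") for text in neutral]
```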
Referring to Fig. 3, the overall flowchart of the neural network automatic modeling method in the third embodiment of the present invention, the method includes:
S301: Obtain simulation preparation data, and simulate according to the simulation preparation data to generate model training samples.
S302: Based on attention, word vectors, LSTM networks, and a decision layer, train a neural network model with the model training samples, and adjust and configure the parameters of the neural network model to obtain a usable neural network model. The adjustment and configuration of the parameters of the neural network model are performed automatically.
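S302 only names the components (attention, word vectors, LSTM networks, a decision layer) without specifying how they are connected. The tf.keras sketch below wires those components together in one plausible way; the layer sizes, the additive attention variant, and the binary sigmoid decision layer are assumptions, not the patent's actual architecture.

```python
# One plausible wiring of the components named in S302 (assumptions: layer
# sizes, additive attention, and a binary sigmoid decision layer).
import tensorflow as tf

def build_model(vocab_size=10000, embed_dim=128, seq_len=100):
    inputs = tf.keras.Input(shape=(seq_len,))
    x = tf.keras.layers.Embedding(vocab_size, embed_dim)(inputs)    # word vectors
    h = tf.keras.layers.LSTM(128, return_sequences=True)(x)         # LSTM network
    scores = tf.keras.layers.Dense(1, activation="tanh")(h)         # attention scores
    weights = tf.keras.layers.Softmax(axis=1)(scores)               # attention weights
    context = tf.keras.layers.Dot(axes=(1, 1))([weights, h])        # weighted sum over time
    context = tf.keras.layers.Flatten()(context)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(context)  # decision layer
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model
```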
Referring to Fig. 4, the overall flowchart of the neural network automatic modeling method in the fourth embodiment of the present invention, the method includes:
S401: Obtain simulation preparation data, and simulate according to the simulation preparation data to generate model training samples.
S402: Train a neural network model with the model training samples on the basis of network features, adjust and configure the parameters of the neural network model, and continue to train the neural network model with the not-yet-used model training samples until all model training samples have been used to train the neural network model, obtaining a usable neural network model. The adjustment and configuration of the parameters of the neural network model are performed automatically.
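S402 requires training to continue on the not-yet-used training samples until every sample has been used. The sketch below assumes the unused samples are simply consumed in fixed-size chunks with a Keras-style model (such as the one sketched under S302); the chunk size and the single epoch per chunk are assumptions.

```python
# Hedged sketch of S402: consume the remaining (unused) training samples in
# fixed-size chunks until none are left; chunk size and epochs are assumptions.
import numpy as np

def train_until_exhausted(model, features, labels, chunk=1000):
    """Keep training on not-yet-used samples until all have been used."""
    for start in range(0, len(features), chunk):
        x_batch = np.asarray(features[start:start + chunk])
        y_batch = np.asarray(labels[start:start + chunk])
        model.fit(x_batch, y_batch, epochs=1, verbose=0)  # weights carry over between calls
    return model  # the usable neural network model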
Referring to Fig. 5, which shows the recall of a neural network model built with the neural network automatic modeling method in an embodiment of the present invention, the figure includes:
the recall 501 of the present invention's modeling on "car owner" texts and the recall 502 of logic-expression modeling on "car owner" texts. It can be seen that the recall 501 of the present invention's modeling is clearly higher than the recall 502 of logic-expression modeling, i.e. the technical solution of the present invention has an advantage in terms of recall. In addition, the recall on texts such as "has purchased accident insurance", "has social insurance", "married", "has children", "business person", "homeowner", and "frequent business traveler" is similar to the recall on "car owner" texts, and is not described again here.
Referring to Fig. 6, which shows the recognition accuracy of a neural network model built with the neural network automatic modeling method in an embodiment of the present invention, the figure includes:
the accuracy 601 of the present invention's modeling on "has children" texts and the accuracy 602 of logic-expression modeling on "has children" texts. It can be seen that the accuracy 601 of the present invention's modeling is clearly higher than the accuracy 602 of logic-expression modeling, i.e. the technical solution of the present invention has an advantage in terms of recognition accuracy. In addition, except that the present invention's modeling accuracy on "homeowner" texts is lower than the logic-expression modeling accuracy on "homeowner" texts, the accuracy on the remaining texts such as "has purchased accident insurance", "has social insurance", "married", "has children", "business person", and "frequent business traveler" is similar to the accuracy on "car owner" texts, and is not described again here.
Referring to Fig. 7, which is a schematic diagram of the operation of the hardware device in an embodiment of the present invention, the hardware device includes: a neural network automatic modeling apparatus 701, a processor 702, and a storage medium 703.
Neural network automatic modeling apparatus 701: the neural network automatic modeling apparatus 701 implements the neural network automatic modeling method.
Processor 702: the processor 702 loads and executes the instructions and data in the storage medium 703 to implement the neural network automatic modeling method.
Storage medium 703: the storage medium 703 stores instructions and data; the storage medium 703 is used to implement the neural network automatic modeling method.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and of course also by hardware. Based on this understanding, the above technical solutions, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A neural network automatic modeling method, characterized by comprising:
obtaining simulation preparation data, and simulating according to the simulation preparation data to generate model training samples;
training a neural network model with the model training samples on the basis of network features, and adjusting and configuring the parameters of the neural network model to obtain a usable neural network model;
wherein the adjustment and configuration of the parameters of the neural network model are performed automatically.
2. The method according to claim 1, characterized in that obtaining the simulation preparation data comprises:
obtaining a text set, marking the characteristic sentences in the text set, and then extracting the neutral texts in the text set;
wherein the characteristic sentences include positive characteristics and/or negative characteristics, and the neutral texts are the texts remaining in the text set after the texts in which characteristic sentences have been marked are removed.
3. The method according to claim 2, characterized in that obtaining the text set comprises:
determining a search category, determining keywords according to the search category, and determining the text set according to the keywords.
4. The method according to claim 2, characterized in that obtaining the text set comprises:
directly selecting a text set.
5. The method according to claim 3, characterized in that the search category is described by a short sentence.
6. The method according to claim 1, characterized in that the network features include:
attention, word vectors, LSTM networks, and a decision layer.
7. The method according to claim 1, characterized in that, after training the neural network model with the model training samples on the basis of the network features and adjusting and configuring the parameters of the neural network model, and before obtaining the usable neural network model, the method further comprises:
training the neural network model with the not-yet-used model training samples until all model training samples have been used to train the neural network model.
8. The method according to claim 2, characterized in that, if there are texts in the text set that fail to be recognized, the texts that fail to be recognized are input into the neural network model as training samples for training, so as to obtain the final usable neural network model.
9. An active interaction device, characterized by comprising:
at least one processor; and
at least one memory communicatively connected to the processor, wherein:
the memory stores program instructions executable by the processor, and the processor invokes the program instructions to perform the method according to any one of claims 1 to 8.
10. A non-transitory readable storage medium, characterized in that the non-transitory readable storage medium stores program instructions, and the program instructions are used to perform the method according to any one of claims 1 to 8.
CN201810475514.0A 2018-05-17 2018-05-17 Neural network automatic modeling method, apparatus and storage medium Pending CN108647785A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810475514.0A CN108647785A (en) 2018-05-17 2018-05-17 Neural network automatic modeling method, apparatus and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810475514.0A CN108647785A (en) 2018-05-17 2018-05-17 Neural network automatic modeling method, apparatus and storage medium

Publications (1)

Publication Number Publication Date
CN108647785A true CN108647785A (en) 2018-10-12

Family

ID=63756717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810475514.0A Pending Neural network automatic modeling method, apparatus and storage medium

Country Status (1)

Country Link
CN (1) CN108647785A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021227293A1 (en) * 2020-05-09 2021-11-18 烽火通信科技股份有限公司 Universal training method and system for artificial intelligence models

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102968410A (en) * 2012-12-04 2013-03-13 江南大学 Text classification method based on RBF (Radial Basis Function) neural network algorithm and semantic feature selection
CN104965819A (en) * 2015-07-12 2015-10-07 大连理工大学 Biomedical event trigger word identification method based on syntactic word vector
CN106407211A (en) * 2015-07-30 2017-02-15 富士通株式会社 Method and device for classifying semantic relationships among entity words
CN107092596A (en) * 2017-04-24 2017-08-25 重庆邮电大学 Text emotion analysis method based on attention CNNs and CCR
CN107102989A (en) * 2017-05-24 2017-08-29 南京大学 A kind of entity disambiguation method based on term vector, convolutional neural networks
CN107229684A (en) * 2017-05-11 2017-10-03 合肥美的智能科技有限公司 Statement classification method, system, electronic equipment, refrigerator and storage medium
CN107239443A (en) * 2017-05-09 2017-10-10 清华大学 The training method and server of a kind of term vector learning model
CN107291795A (en) * 2017-05-03 2017-10-24 华南理工大学 A kind of dynamic word insertion of combination and the file classification method of part-of-speech tagging
CN107656921A (en) * 2017-10-10 2018-02-02 上海数眼科技发展有限公司 A kind of short text dependency analysis method based on deep learning
CN107992941A (en) * 2017-12-28 2018-05-04 武汉璞华大数据技术有限公司 A kind of contract terms sorting technique
CN108030488A (en) * 2017-11-30 2018-05-15 北京医拍智能科技有限公司 The detecting system of arrhythmia cordis based on convolutional neural networks
CN108039203A (en) * 2017-12-04 2018-05-15 北京医拍智能科技有限公司 The detecting system of arrhythmia cordis based on deep neural network


Similar Documents

Publication Publication Date Title
US20210065058A1 (en) Method, apparatus, device and readable medium for transfer learning in machine learning
CN104067314B (en) Humanoid image partition method
CN109765462A (en) Fault detection method, device and the terminal device of transmission line of electricity
CN107239443A (en) The training method and server of a kind of term vector learning model
CN106982359A (en) A kind of binocular video monitoring method, system and computer-readable recording medium
CN107526831A (en) A kind of natural language processing method and apparatus
CN108364550B (en) Experience type interactive process design method for extra-high voltage electric virtual training
CN111104732A (en) Intelligent planning method for mobile communication network based on deep reinforcement learning
CN107016212A (en) Intention analysis method based on dynamic Bayesian network
CN110728182A (en) Interviewing method and device based on AI interviewing system and computer equipment
CN114638442B (en) Flight training scheme generation system, method and equipment for individual difference
DE112021000689T5 (en) ATTESTATION OF NEURAL PROCESSES
CN108829777A (en) A kind of the problem of chat robots, replies method and device
CN111461284A (en) Data discretization method, device, equipment and medium
CN108647785A (en) A kind of neural network method for automatic modeling, device and storage medium
CN106708950A (en) Data processing method and device used for intelligent robot self-learning system
Madni et al. Augmenting MBSE with Digital Twin Technology: Implementation, Analysis, Preliminary Results, and Findings
CN110162769A (en) Text subject output method and device, storage medium and electronic device
CN108564134A (en) Data processing method, device, computing device and medium
Shafaat et al. Exploring the role of design in systems engineering
Bushuyev et al. Cognitive Readiness of Managing Infrastructure Projects Driving by SMARTification
Mutoh et al. A model of friendship networks based on social network analysis
CN109460485A (en) A kind of image library method for building up, device and storage medium
CN109858006A (en) Subject recognition training method, apparatus
CN111598105A (en) Patent map construction method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200309

Address after: 519000 room 105-58115, No. 6, Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province (centralized office area)

Applicant after: Puqiang times (Zhuhai Hengqin) Information Technology Co., Ltd

Address before: 100089 Haidian District, Beijing, Yongfeng Road, North Road, South East Road, F, 2 floor.

Applicant before: Puqiang Information Technology (Beijing) Co., Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20181012