JP2013058095A5 - - Google Patents

Info

Publication number
JP2013058095A5
Authority
JP
Japan
Prior art keywords
distribution
learning data
function
input
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2011196300A
Other languages
Japanese (ja)
Other versions
JP5909943B2 (en)
JP2013058095A (en)
Filing date
Publication date
Application filed
Priority to JP2011196300A
Priority claimed from JP2011196300A
Priority to US13/591,520
Priority to CN201210320527.3A
Publication of JP2013058095A
Publication of JP2013058095A5
Application granted
Publication of JP5909943B2
Legal status: Expired - Fee Related
Anticipated expiration


Description

(About the effects of online learning)
An experiment was conducted using the automatic construction method for language analyzers described above. The results are shown in FIG. 39. The horizontal axis of the graph in FIG. 39 is the elapsed time in days, and the vertical axis is the average F-measure. The solid line (Online, 1k) and the broken line (Online, 4k) show the results obtained when the learning data set was continuously updated by online learning, while the chain line (Offline, 1k) and the alternate long and short dash line (Offline, 4k) show the results obtained with offline learning. Here, 1k indicates that the number of learning data items used to construct the estimator was set to 1000, and 4k indicates that it was set to 4000.
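For orientation only, the online update compared above can be pictured as the following Python sketch: newly labeled samples are merged into the learning data set, the set is capped at a fixed size (1000 or 4000 in the experiment), and the estimator is rebuilt after each update. The random subsampling and the build_estimator / evaluate_f_measure callables are illustrative assumptions, not the patented procedure.

import random

def online_update(initial_data, batches, build_estimator, evaluate_f_measure,
                  capacity=1000, seed=0):
    # Sequentially merge newly labeled data, cap the data set size,
    # rebuild the estimator, and record its evaluation score.
    rng = random.Random(seed)
    learning_data = list(initial_data)
    scores = []
    for batch in batches:
        learning_data.extend(batch)
        if len(learning_data) > capacity:
            learning_data = rng.sample(learning_data, capacity)
        estimator = build_estimator(learning_data)
        scores.append(evaluate_f_measure(estimator))
    return scores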

Claims (10)

1. An information processing apparatus comprising:
a feature vector calculation unit that, when a plurality of pieces of learning data each composed of input data and an objective variable corresponding to the input data are given, inputs the input data into a plurality of basis functions and calculates a feature vector whose elements are the output values of the respective basis functions;
a distribution adjustment unit that adjusts the distribution of the points specified by the feature vectors in a feature space so that the distribution approaches a predetermined distribution; and
a function generation unit that, for the plurality of pieces of learning data, generates an estimation function that outputs an estimated value of the objective variable in response to an input of the feature vector.
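Purely as an illustration of the three claimed elements (feature vector calculation, distribution adjustment, function generation), the following Python sketch uses Gaussian basis functions, a pluggable adjustment step, and a least-squares fit; these concrete choices are assumptions made for brevity, not the claimed implementation.

import numpy as np

def gaussian_bases(centers, width=1.0):
    # Example basis functions: phi_k(x) = exp(-||x - c_k||^2 / (2 * width^2)).
    return [lambda x, c=np.asarray(c, dtype=float):
                np.exp(-np.sum((np.asarray(x, dtype=float) - c) ** 2) / (2 * width ** 2))
            for c in centers]

def feature_vectors(bases, inputs):
    # One feature vector per input; its elements are the basis-function outputs.
    return np.array([[phi(x) for phi in bases] for x in inputs])

def generate_estimator(inputs, targets, bases, adjust=None):
    z = feature_vectors(bases, inputs)          # feature vector calculation
    y = np.asarray(targets, dtype=float)
    if adjust is not None:                      # distribution adjustment hook
        z, y = adjust(z, y)
    w, *_ = np.linalg.lstsq(z, y, rcond=None)   # estimation function (linear in z)
    return lambda x: float(feature_vectors(bases, [x])[0] @ w)

Here adjust stands in for the distribution adjustment unit detailed in claims 2 to 4 below.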
2. The information processing apparatus according to claim 1, wherein the distribution adjustment unit thins out the learning data so that the distribution of the points specified by the feature vectors in the feature space approaches the predetermined distribution.
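One way to picture the thinning described here (an assumed illustration, not the claimed algorithm) is histogram-based subsampling in feature space: bins that are over-represented are randomly thinned until the retained points are spread roughly uniformly. For simplicity the sketch bins only the first feature coordinate.

import numpy as np

def thin_toward_uniform(z, y, per_bin=10, bins=10, seed=0):
    # Keep at most `per_bin` samples in each histogram bin of the first
    # feature coordinate, so the retained distribution approaches uniform.
    rng = np.random.default_rng(seed)
    z, y = np.asarray(z, dtype=float), np.asarray(y, dtype=float)
    edges = np.linspace(z[:, 0].min(), z[:, 0].max(), bins + 1)
    bin_ids = np.clip(np.digitize(z[:, 0], edges[1:-1]), 0, bins - 1)
    keep = []
    for b in range(bins):
        idx = np.flatnonzero(bin_ids == b)
        if idx.size:
            keep.extend(rng.choice(idx, size=min(idx.size, per_bin), replace=False))
    keep = np.sort(np.asarray(keep))
    return z[keep], y[keep]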
3. The information processing apparatus according to claim 1, wherein the distribution adjustment unit weights each piece of learning data so that the distribution of the points specified by the feature vectors in the feature space approaches the predetermined distribution.
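Weighting can be sketched in the same spirit (again an assumption, not the claimed method): estimate how densely populated each region of feature space is and give each learning sample a weight inversely proportional to that density, so that over-represented regions count less and the effective distribution flattens toward the target.

import numpy as np

def inverse_density_weights(z, bins=10):
    # Weight each sample by 1 / (histogram count of its bin), normalized to mean 1.
    z = np.asarray(z, dtype=float)
    edges = np.linspace(z[:, 0].min(), z[:, 0].max(), bins + 1)
    bin_ids = np.clip(np.digitize(z[:, 0], edges[1:-1]), 0, bins - 1)
    counts = np.bincount(bin_ids, minlength=bins).astype(float)
    w = 1.0 / counts[bin_ids]
    return w * len(w) / w.sum()

def weighted_fit(z, y, sample_weights):
    # Weighted least squares: scale each row by sqrt(weight), then solve.
    s = np.sqrt(np.asarray(sample_weights, dtype=float))
    coef, *_ = np.linalg.lstsq(np.asarray(z, dtype=float) * s[:, None],
                               np.asarray(y, dtype=float) * s, rcond=None)
    return coef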
4. The information processing apparatus according to claim 1, wherein the distribution adjustment unit thins out the learning data so that the distribution of the points specified by the feature vectors in the feature space approaches the predetermined distribution, and weights each piece of learning data remaining after the thinning.
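Claim 4 combines the two preceding adjustments. Reusing the hypothetical thin_toward_uniform and inverse_density_weights sketches above, the combination could look like this:

def thin_then_weight(z, y, per_bin=10, bins=10, seed=0):
    # First thin toward the target distribution, then weight the remaining data.
    z_kept, y_kept = thin_toward_uniform(z, y, per_bin=per_bin, bins=bins, seed=seed)
    weights = inverse_density_weights(z_kept, bins=bins)
    return z_kept, y_kept, weights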
5. The information processing apparatus according to any one of claims 1 to 4, wherein the predetermined distribution is a uniform distribution or a Gaussian distribution.
6. The information processing apparatus according to claim 2 or 4, wherein, when new learning data is additionally given, the distribution adjustment unit thins out the learning data from a learning data group including the new learning data and the existing learning data so that the distribution of the points specified by the feature vectors in the feature space approaches the predetermined distribution.
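When learning data arrives incrementally, the same thinning can simply be re-applied to the union of the existing and the newly added samples; a minimal sketch, again reusing the hypothetical thin_toward_uniform helper, is:

import numpy as np

def update_learning_data(z_old, y_old, z_new, y_new, per_bin=10, bins=10, seed=0):
    # Merge existing and newly added learning data, then thin the union so the
    # feature-space distribution again approaches the predetermined distribution.
    z_all = np.vstack([z_old, z_new])
    y_all = np.concatenate([y_old, y_new])
    return thin_toward_uniform(z_all, y_all, per_bin=per_bin, bins=bins, seed=seed)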
7. The information processing apparatus according to any one of claims 1 to 6, further comprising a basis function generation unit that generates the basis functions by combining a plurality of functions prepared in advance.
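As one possible (assumed) reading of "combining a plurality of functions prepared in advance", primitive one-argument functions can be composed on top of a random linear projection of the input to produce each basis function:

import numpy as np

PRIMITIVES = [np.sin, np.cos, np.tanh, np.abs, np.square]  # prepared in advance

def random_basis_function(dim, depth=2, seed=None):
    # Compose `depth` primitives on top of a random linear projection of x.
    rng = np.random.default_rng(seed)
    w = rng.normal(size=dim)
    chain = [PRIMITIVES[int(i)] for i in rng.integers(len(PRIMITIVES), size=depth)]
    def phi(x):
        v = float(np.dot(w, np.asarray(x, dtype=float)))
        for f in chain:
            v = float(f(v))
        return v
    return phi

def generate_bases(dim, count, seed=0):
    rng = np.random.default_rng(seed)
    return [random_basis_function(dim, seed=int(rng.integers(1 << 31)))
            for _ in range(count)]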
8. The information processing apparatus according to claim 7, wherein
the basis function generation unit updates the basis functions on the basis of a genetic algorithm,
the feature vector calculation unit, when the basis functions have been updated, inputs the input data into the updated basis functions to calculate the feature vectors, and
the function generation unit generates an estimation function that outputs an estimated value of the objective variable in response to an input of a feature vector calculated using the updated basis functions.
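The genetic-algorithm update is sketched below only as a simplified stand-in: selection by a correlation score plus random regeneration in place of full crossover and mutation, with all names hypothetical.

import numpy as np

def evolve_bases(bases, inputs, targets, make_candidate, keep_ratio=0.5, seed=0):
    # Score each basis function by |correlation| of its outputs with the targets,
    # keep the best-scoring fraction, and refill the pool with fresh candidates.
    rng = np.random.default_rng(seed)
    y = np.asarray(targets, dtype=float)
    outputs = np.array([[phi(x) for x in inputs] for phi in bases])
    scores = [abs(np.corrcoef(out, y)[0, 1]) if np.std(out) > 0 and np.std(y) > 0 else 0.0
              for out in outputs]
    order = np.argsort(scores)[::-1]
    n_keep = max(1, int(len(bases) * keep_ratio))
    survivors = [bases[i] for i in order[:n_keep]]
    while len(survivors) < len(bases):
        survivors.append(make_candidate(int(rng.integers(1 << 31))))
    return survivors

After such an update, the feature vectors would be recalculated with the updated bases and the estimation function regenerated, as the claim states.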
9. An estimator generation method comprising the steps of:
when a plurality of pieces of learning data each composed of input data and an objective variable corresponding to the input data are given, inputting the input data into a plurality of basis functions and calculating a feature vector whose elements are the output values of the respective basis functions;
adjusting the distribution of the points specified by the feature vectors in a feature space so that the distribution approaches a predetermined distribution; and
for the plurality of pieces of learning data, generating an estimation function that outputs an estimated value of the objective variable in response to an input of the feature vector.
10. A program for causing a computer to realize:
a feature vector calculation function that, when a plurality of pieces of learning data each composed of input data and an objective variable corresponding to the input data are given, inputs the input data into a plurality of basis functions and calculates a feature vector whose elements are the output values of the respective basis functions;
a distribution adjustment function that adjusts the distribution of the points specified by the feature vectors in a feature space so that the distribution approaches a predetermined distribution; and
a function generation function that, for the plurality of pieces of learning data, generates an estimation function that outputs an estimated value of the objective variable in response to an input of the feature vector.
JP2011196300A 2011-09-08 2011-09-08 Information processing apparatus, estimator generation method, and program Expired - Fee Related JP5909943B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2011196300A JP5909943B2 (en) 2011-09-08 2011-09-08 Information processing apparatus, estimator generation method, and program
US13/591,520 US20130066452A1 (en) 2011-09-08 2012-08-22 Information processing device, estimator generating method and program
CN201210320527.3A CN103177177B (en) 2011-09-08 2012-08-31 Message processing device and estimator generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011196300A JP5909943B2 (en) 2011-09-08 2011-09-08 Information processing apparatus, estimator generation method, and program

Publications (3)

Publication Number Publication Date
JP2013058095A JP2013058095A (en) 2013-03-28
JP2013058095A5 true JP2013058095A5 (en) 2014-09-25
JP5909943B2 JP5909943B2 (en) 2016-04-27

Family

ID=48133934

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011196300A Expired - Fee Related JP5909943B2 (en) 2011-09-08 2011-09-08 Information processing apparatus, estimator generation method, and program

Country Status (1)

Country Link
JP (1) JP5909943B2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5874292B2 (en) * 2011-10-12 2016-03-02 ソニー株式会社 Information processing apparatus, information processing method, and program
CN108465244B (en) * 2018-03-30 2019-05-07 腾讯科技(深圳)有限公司 AI method for parameter configuration, device, equipment and storage medium for racing class AI model
JP7259935B2 (en) * 2019-03-04 2023-04-18 日本電気株式会社 Information processing system, information processing method and program
JP7270894B1 (en) * 2022-12-09 2023-05-11 株式会社Creator’s NEXT Identification of digital data of new styles

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4948118B2 (en) * 2005-10-25 2012-06-06 ソニー株式会社 Information processing apparatus, information processing method, and program
JP4670662B2 (en) * 2006-01-26 2011-04-13 パナソニック電工株式会社 Anomaly detection device
JP5126694B2 (en) * 2009-07-21 2013-01-23 日本電気株式会社 Learning system

Similar Documents

Publication Publication Date Title
JP6827539B2 (en) Training action selection neural networks
JP2018526733A5 (en)
JP6824382B2 (en) Training machine learning models for multiple machine learning tasks
JP2013084175A5 (en)
JP2013524355A5 (en)
WO2019018375A1 (en) Neural architecture search for convolutional neural networks
JP2021507323A5 (en)
WO2017091629A1 (en) Reinforcement learning using confidence scores
JP2011003186A5 (en)
WO2018189404A1 (en) Distributional reinforcement learning
JP2014517602A5 (en)
US8635174B2 (en) Information processing apparatus, observation value prediction method, and program
CN105989374B (en) Method and equipment for training model on line
WO2016058485A3 (en) Methods and devices for calculating ranking score and creating model, and product recommendation system
JP2015210750A5 (en)
US9436907B2 (en) Method and system for calculating value of website visitor
JP2013058095A5 (en)
CN108376284A (en) Control device and control method
MX2014013721A (en) Hidden-variable-model estimation device and method.
JP2012190061A5 (en)
JP2012165947A5 (en)
JP2014214566A5 (en) Excavator processing apparatus and work content determination method
WO2015082107A3 (en) Method and device for determining a data-based functional model
Zhou et al. LSSVM and hybrid particle swarm optimization for ship motion prediction
IN2013MU03240A (en)