JP2021068456A - Calculation technique for eliminating "multicollinearity" or the like in regression analysis, and obtaining partial regression coefficient indicating contribution to proper objective variable of explanatory variable, as management material - Google Patents

Calculation technique for eliminating "multicollinearity" or the like in regression analysis, and obtaining partial regression coefficients indicating the contribution of explanatory variables to the objective variable, as management material

Info

Publication number
JP2021068456A
Authority
JP
Japan
Prior art keywords
variable
multicollinearity
variables
correlation
sales
Prior art date
Legal status
Pending
Application number
JP2020183948A
Other languages
Japanese (ja)
Other versions
JP2021068456A5 (en)
Inventor
Masahiro Shirai (白井 雅浩)
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to JP2020183948A
Publication of JP2021068456A
Publication of JP2021068456A5
Legal status: Pending

Abstract

PROBLEM TO BE SOLVED: In multiple regression analysis applied to the relationship between an industrial result Y and its causes X, the partial regression coefficients bi of the causal explanatory variables Xi, which indicate each variable's contribution to the result Y, diverge from theory and experience, because many of the selected explanatory variables are highly correlated with the objective variable Y and, through it, with one another.

SOLUTION: A calculation technique is provided for eliminating "multicollinearity" and related distortions. The correlation-coefficient determinant of the normal equations relating the explanatory variables Xi and the result Y is formed. Because the two sides of each row are numerically unbalanced, the portion of each subordinate off-diagonal variable Xj that overlaps the dominant diagonal variable Xi of that row is removed; the remainder becomes the new off-diagonal element of the matrix, and the coefficients bi are then obtained from the normal equations with this new correlation determinant.

Description

The present invention is a calculation technique whose purpose is to eliminate problems such as "multicollinearity" that arise when multiple regression analysis is applied to the relationship between an industrial result, such as sales, and its causes, such as selling expenses, and to obtain appropriate partial regression coefficients bi indicating the contribution of each causal variable, such as selling expenses, to sales.

When multiple regression analysis is applied to clarify the relationship between results and causes, problems such as "multicollinearity" arise among the explanatory variables, preventing sound management decisions. The following hypothetical example shows the distortion of the partial regression coefficients bi caused by "multicollinearity", and also the distortion of bi when the explanatory variables are moderately highly correlated with one another.

Multiple regression equation: Y = b0 + b1X1 + b2X2 (1)
Multiple regression equation for standardized variables: Ay = b1A1 + b2A2 (2)

Hypothetical example
Suppose sales and their causes, personnel costs and sales-floor area, are as follows.

          Sales (million yen)  Personnel costs (100,000 yen)  Sales-floor area (10 m²)
                  Y                    X1                            X2
Store A          46                    97                            75
Store B          18                    61                            59
Store C          36                    83                            72
Store D          22                    37                            51
Store E          27                    75                            48

Multiple regression analysis with two explanatory variables is applied to these data, with the following results.
Correlation coefficient of sales with personnel costs, R1: 0.84
Correlation coefficient of sales with sales-floor area, R2: 0.79
Correlation coefficient of personnel costs with sales-floor area, r12: 0.72
b1 = 0.56, b2 = 0.38
Although the correlation coefficients of personnel costs and sales-floor area with sales differ little, the partial regression coefficients b1 and b2, which indicate each cause's contribution to the result, differ greatly. This is because personnel costs and sales-floor area are themselves fairly highly correlated, and even managers with practical experience doubt the validity of such an analysis.

Analysis example 2 (only the results from Example 3 are shown)

Case name                                          J      K      L
Correlation coefficient with the objective
variable, R:                                      0.51   0.62   0.54
Partial regression coefficient, bi:               0.10   0.43   0.33

This is an analysis of artificially constructed cases, but the partial regression coefficient of case J is small relative to its correlation coefficient. This is because the correlation coefficients among the explanatory variables are somewhat large, around 0.5; yet correlations of that order cannot be called especially high.

Regression analysis examines how well a set of explanatory variables (X1, X2, ..., Xp) explains the objective variable Y. Originally, the causal values Xi were highly independent quantities from physics, chemistry, and the like, and the analysis did not seek the individual contribution of each Xi in the set. Later, when multiple regression analysis came to be used to analyze the relationship between corporate sales, a result of socio-economic activity, and its causes among selling expenses, the causes adopted, such as personnel costs, selling expenses, and sales-floor area, were chosen precisely for their strong causal relationship with sales. Mediated by the sales figures, the correlations among these causes are then also high, and the coefficients bi indicating their contribution to sales came to show abnormal values contrary to common sense and theory. This is the problem of "multicollinearity".

Standardized variables
In equation (1), if the measurement unit of an explanatory variable Xi is a head count or an area in m², the meaning of the resulting bi becomes ambiguous. Since the variables must also be standardized to build the correlation matrix used heavily in the analysis below, all variables are used in standardized form.
Hereinafter A1 and A2 are the standardized variables of X1 and X2.
Standardization: Ai = (Xi − X̄i) / σi (3)
where X̄i is the mean of Xi and σi is the standard deviation of Xi.
By standardizing Y, X1, and X2, each standardized variable Ai becomes a set of values with mean 0 and standard deviation 1.
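A minimal sketch of formula (3) (the editor's illustration; the population standard deviation with divisor n is assumed, matching the correlation formulas of the form ΣA1A2/n used below):

```python
import math

def standardize(x):
    """Formula (3): Ai = (Xi - mean(Xi)) / sigma_i (population sigma)."""
    n = len(x)
    mean = sum(x) / n
    sigma = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    return [(v - mean) / sigma for v in x]

X1 = [97, 61, 83, 37, 75]      # personnel costs from the hypothetical example
A1 = standardize(X1)

# After standardization the values have mean 0 and standard deviation 1.
mean_A1 = sum(A1) / len(A1)
sigma_A1 = math.sqrt(sum(a ** 2 for a in A1) / len(A1))
```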

Correlation matrix of the normal equations
The analysis below relies mainly on the correlation-matrix form of the normal equations of multiple regression, so it is introduced here.
Correlation coefficients: ΣA1A1/n = 1, ΣA1A2/n = r12, where n is the number of samples.
The normal equations can then be written as

  [ 1    r12 ] [ b1 ]   [ R1 ]
  [ r12  1   ] [ b2 ] = [ R2 ]   (4)

Patent Document 1
Patent No. 5120820, "Device for properly appraising standard residential land by regression analysis that eliminates 'multicollinearity'"
The solution to "multicollinearity" is how to satisfy the equality of the normal equations in matrix form despite the numerical imbalance of the two sides. Put simply, the two terms on the left side carry roughly twice the correlation value of the single term on the right side, so an overlapping portion is expected to exist, and that portion is removed.
In the two-variable case, the relatively high correlation between the explanatory variables, mediated by the objective variable (the land price), is removed as follows to form the off-diagonal element of the first row of the matrix:
new r12 = r12(1 − R1R2)
This removes the quantity R1R2 at once for a whole series of cases, not case by case. The removal can be expected to relieve "multicollinearity" to some degree, but it is a rough calculation.

Patent Document 2
Patent No. 5585921, "Device that clarifies the relationship between sales, an industrial result, and its causes such as selling expenses by applying multiple regression analysis that eliminates 'multicollinearity'"
The "multicollinearity" removal in this patent is more careful than that of Patent Document 1, being applied case by case, but because it operates on the standardized explanatory variables themselves, the cases must be classified by whether the signs of the standardized variables agree or not and treated separately. The procedure is inconsistent, and the resulting correlation matrix is asymmetric.
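As an illustration (the editor's, not the patent's — Patent Document 1 concerns land prices), applying its adjustment r12(1 − R1R2) to the two-variable sales example gives:

```python
# Two-decimal correlations from the hypothetical sales example.
R1, R2, r12 = 0.84, 0.79, 0.72

# Patent Document 1: replace the off-diagonal correlation by r12(1 - R1*R2).
r12_new = r12 * (1 - R1 * R2)          # 0.72 * (1 - 0.6636) = 0.2422...

# Solve the adjusted normal equations [1, r12_new; r12_new, 1] b = [R1, R2].
det = 1 - r12_new ** 2
b1 = (R1 - r12_new * R2) / det
b2 = (R2 - r12_new * R1) / det
print(round(b1, 2), round(b2, 2))      # 0.69 0.62
```

The adjusted coefficients come out much closer together than the original 0.56 versus 0.38, roughly mirroring the relative sizes of R1 and R2, which is the effect the text attributes to this method.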

Common measures for avoiding "multicollinearity"
The following methods are generally considered to avoid the "multicollinearity" problem:
deletion of one of a pair of highly correlated explanatory variables;
principal component analysis;
ridge regression.
Deleting an explanatory variable means discarding a cause, such as the number of employees or selling expenses, that was deliberately selected for its significance to sales; this is not appropriate.
Principal component analysis summarizes p variables X1, X2, ..., Xp into m composite variables (m < p) under certain conditions; it does not yield the coefficients bi of the explanatory variables.
Ridge regression obtains the partial regression coefficients after adding a constant k to the diagonal elements of the correlation matrix:

  [ 1+k  r12 ] [ b1 ]   [ R1 ]
  [ r12  1+k ] [ b2 ] = [ R2 ]

This has the effect of reducing the relative weight of the off-diagonal correlation r12, and to that extent relieves "multicollinearity"; but there is no firm criterion for choosing the constant added to the diagonal, so the method is arbitrary and, in the author's view, little used.
Arthur E. Hoerl and Robert W. Kennard, "Ridge Regression", February 1970
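A sketch of the ridge calculation described above, on the same example correlations (the editor's illustration; the values of k are arbitrary, which is exactly the objection the text raises):

```python
R1, R2, r12 = 0.84, 0.79, 0.72   # correlations from the sales example

def ridge_coeffs(k):
    """Solve [1+k, r12; r12, 1+k] [b1, b2] = [R1, R2] (two-variable case)."""
    det = (1 + k) ** 2 - r12 ** 2
    b1 = ((1 + k) * R1 - r12 * R2) / det
    b2 = ((1 + k) * R2 - r12 * R1) / det
    return b1, b2

# As k grows, both coefficients shrink and the gap between them narrows.
for k in (0.0, 0.1, 0.5):
    b1, b2 = ridge_coeffs(k)
    print(f"k={k}: b1={b1:.3f}, b2={b2:.3f}")
```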

Whereas ridge regression relieves "multicollinearity" indirectly, by operating on the diagonal elements so as to reduce the relative weight of the off-diagonal values, the present invention takes the off-diagonal elements of the correlation matrix as the direct object of analysis.

Looking at the first row of equation (4), the left-side coefficients of the causal variables X1 and X2 sum to (1 + r12). Given how the variables are selected, r12 can be taken as a positive number greater than 0, so this sum exceeds 1, while the right side R1 satisfies 1 > R1 > 0; the left side is therefore numerically the larger. To satisfy the equality of the two sides, the following approaches can be considered.
The first approach is the current method of regression analysis, which, so to speak, forces the two sides into equality; it is this that gives rise to the "multicollinearity" problem.
The second approach applies when equation (4) has two or more causal variables and the causal-variable values in the left-side terms are expected to overlap: the overlap is removed. Even then some imbalance between the two sides remains, but it is greatly relaxed. Whether this operation is appropriate depends on whether the removal method is theoretically and practically rational. The first term of the first row on the left side is the product of X1 with itself, and the right side is the correlation of the objective variable Y with X1 alone, so with respect to this row X2 is a variable in a subordinate position. Accordingly, if the overlap between the variables is to be removed, it is appropriate to remove it from the subordinate X2, taking the dominant Xi as the standard.
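The numerical imbalance described above can be seen directly with the example figures (the editor's illustration):

```python
r12, R1 = 0.72, 0.84   # from the hypothetical sales example

# First row of equation (4): the left-side coefficients of b1 and b2 sum to
# 1 + r12, while the right-hand side is the single correlation R1.
left_sum = 1 + r12
assert left_sum > R1   # 1.72 versus 0.84: roughly double, as the text argues
```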

1. Adding a constant to, or multiplying by a constant, the standardized variable Ai does not change its essential values (see standardization formula (3)).
2. ΣA1A2/n = r12.
3. Removing the overlap directly from the Ai values is complicated when negative numbers are present, so a constant is added to every Ai so that all values used in the removal calculation are positive and on a unified scale (in this embodiment 1.8, chosen with the largest negative value of A in mind).
4. For the left side (diagonal element A1, off-diagonal element A2), the portion of A2 that coincides with A1 is removed as follows: from each value of A2, remove up to the smaller of the absolute values of A1 and A2, and take the remainder.

Removal example:  A1    A2    removed  remainder
                  0.3   0.2   0.2      0
                  0.2   0.5   0.2      0.3

With the constant addition of step 3, the sums of the A1 and A2 values in the embodiment both equal 18. Consequently, as the following simple example shows, the off-diagonal elements of the normal-equation matrix at positions symmetric about the diagonal of the square matrix come out equal.
Let the sums of the A1 and A2 values both equal 0.7.

When A1 is the diagonal element (removal standard) and A2 the off-diagonal element to be reduced:
  A1    A2    removed  remainder
  0.3   0.5   0.3      0.2
  0.4   0.2   0.2      0

In the opposite case:
  A2    A1    removed  remainder
  0.5   0.3   0.3      0
  0.2   0.4   0.2      0.2

In each case the total remainder is the same, so the off-diagonal elements at positions symmetric about the diagonal of the correlation matrix of the normal equations are equal.
5. Since the removal remainder is a part of A2, it is multiplied into ΣA1A2/n (which is the correlation coefficient r12 before overlap removal; see equation (4)) to obtain a new r12. A new correlation matrix after overlap removal is formed, and the coefficients bi (i = 1, 2) after overlap removal are obtained.
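The removal in steps 1–5 can be sketched as follows. This is the editor's reading under stated assumptions: the shift constant 1.8, the observation-by-observation minimum rule of step 4, and the interpretation of step 5 as scaling r12 by the remaining fraction of A2 are inferences from the text, not a definitive implementation of the patent.

```python
import math

def standardize(x):
    """Formula (3) with the population standard deviation."""
    n = len(x)
    m = sum(x) / n
    s = math.sqrt(sum((v - m) ** 2 for v in x) / n)
    return [(v - m) / s for v in x]

def corr(u, v):
    """Pearson correlation coefficient."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    du = math.sqrt(sum((a - mu) ** 2 for a in u))
    dv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return num / (du * dv)

def overlap_adjusted_r12(X1, X2, shift=1.8):
    """Steps 3-5 as read by the editor: shift the standardized variables so all
    values are positive, remove from the subordinate A2 the part overlapping
    the dominant A1 observation by observation, and scale the original r12 by
    the fraction of A2 left over."""
    A1 = [a + shift for a in standardize(X1)]            # step 3: constant addition
    A2 = [a + shift for a in standardize(X2)]
    removed   = [min(a1, a2) for a1, a2 in zip(A1, A2)]  # step 4
    remainder = [a2 - r for a2, r in zip(A2, removed)]
    fraction  = sum(remainder) / sum(A2)                 # step 5 (one reading)
    return corr(X1, X2) * fraction

# Data from the hypothetical sales example.
X1 = [97, 61, 83, 37, 75]
X2 = [75, 59, 72, 51, 48]
r12_new = overlap_adjusted_r12(X1, X2)
# r12_new comes out far below the original r12 of about 0.72.
```

The adjusted value would then replace r12 as the off-diagonal element of equation (4) before solving for the bi.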

To confirm the validity of the bi thus obtained, the ratios of the bi are computed with the largest original correlation coefficient of the explanatory variables with the objective variable Y set to 100; the ratios come out close to the original correlation values, showing that "multicollinearity" and related distortions have been eliminated.

Implementation procedure

1. The objective variable Y and the explanatory variables Xi of the normal equations of multiple regression analysis are standardized, both to suppress variation due to the measurement units of the explanatory variables and to construct the correlation matrix of the normal equations.
2. The property that adding a constant to, or multiplying by a constant, a standardized variable does not change its essential values is used.
3. A constant is added to the standardized variables so that the values in each row of the normal equations are positive and no negative numbers arise in the overlap removal.
4. The overlap of the subordinate off-diagonal variables with the dominant diagonal variable is removed by the calculation of [0012].
5. Since the resulting remainder is a part of the dependent variable forming the off-diagonal element of the matrix [0012], a new determinant is formed with it as the off-diagonal correlation coefficient of the normal equations.
6. From the new matrix of the normal equations obtained by the above operations, the coefficients bi indicating the contribution to the objective variable Y are obtained.
7. To verify that the computation for eliminating "multicollinearity" described so far is appropriate, its validity is checked by successively adding explanatory variables to the same case. The results of the elimination procedure applied with four explanatory variables are close to the relative ratios of the explanatory variables' correlation coefficients with the original objective variable, and are broadly valid.

Example 1
The following objective variable Y and explanatory variables Xh, Xi, Xj, Xk, Xl are arbitrarily chosen values. To simplify the calculation, the correlation coefficients among the variables are computed.
Figure 2021068456 (correlation table of Example 1)

Example 2: calculations with 2, 3, and 4 variables
Two variables: Figure 2021068456
Three variables: Figure 2021068456
Four variables: Figure 2021068456, Figure 2021068456

Example 3
A regression analysis obtaining bi for three cases Xj, Xk, Xl whose correlation coefficients with Y are at the 0.5 level.
Figure 2021068456

本発明は、産業上の成果である売上高等と原因である販売経費等の関係を重回帰分析で行う際に生ずる“多重共線性”等の問題を解消し、売上高に寄与する販売経費などの原因変数の寄与度を示す適切な偏回帰係数biを求めることを目的とする計算技術である。The present invention solves problems such as "multicollinearity" that occur when the relationship between sales, etc., which is an industrial result, and sales expenses, etc., which is the cause, is performed by multiple regression analysis, and sales expenses, etc. that contribute to sales. This is a calculation technique for the purpose of obtaining an appropriate partial regression coefficient bi indicating the contribution of the causative variable of.

重回帰分析を適用し、成果と原因の関係を解明する場合に、説明変数に“多重共線性”等の問題が生じ、適切な経営判断ができない。このことを次の仮設分析例で”多重共線性”による偏回帰係数biの歪みを、また、説明変数間にある程度高い相関のある場合のbiの歪みを示す。When applying multiple regression analysis to elucidate the relationship between results and causes, problems such as "multicollinearity" occur in the explanatory variables, and appropriate management decisions cannot be made. In the following hypothetical analysis example, the distortion of the partial regression coefficient bi due to "multicollinearity" is shown, and the distortion of bi when there is a high correlation between the explanatory variables to some extent.

重回帰式 Y=b0+b1X1+b2X2 (1)
基準化した変数の重回帰式
Ay=bA1+b2A2 (2)
仮設分析例
売上高とその原因である人件費、売場面積は次のようであるとする。
売上高(百万円) 人件費(10万円) 売場面積(10m
Y X1 X2
A店 46 97 75
B店 18 61 59
C店 36 83 72
D店 22 37 51
E店 27 75 48
この資料に対して説明変数が2ケである場合の重回帰分析を適用する。
この計算結果は、次の通りである。
売上高対人件費の相関係数 R1 :0.84
売上高対売場面積の相関係数 R2 :0.79
人件費対売場面積の相関係数 r12 :0.72
b1=0.56 b2=0.38
売上高に対する人件費、売場面積の相関係数にそれほど差がないのに、原因の成果に対
する寄与度を示す偏回帰係数b1、b2の数値に大きな差がある。これは人件費、売場
面積の間に相当程度の高い相関があることによるものであり、実務経験者である経営者
もこの分析結果の妥当性に疑問を持つ。
Multiple regression equation Y = b0 + b1X1 + b2X2 (1)
Multiple regression equation of standardized variables Ay = bA1 + b2A2 (2)
Temporary analysis example Sales, labor costs, and sales floor area, which are the causes, are as follows.
Sales (million yen) Personnel costs (100,000 yen) Sales floor area (10m 2 )
Y X1 X2
Store A 46 97 75
Store B 18 61 59
C store 36 83 72
Store D 22 37 51
E store 27 75 48
A multiple regression analysis is applied to this material when there are two explanatory variables.
The result of this calculation is as follows.
Correlation coefficient between sales and labor costs R1: 0.84
Correlation coefficient of sales to sales floor area R2: 0.79
Correlation coefficient between labor cost and sales floor area r12: 0.72
b1 = 0.56 b2 = 0.38
Although there is not much difference in the correlation coefficient between labor cost and sales floor area with respect to sales, there is a large difference in the numerical values of the partial regression coefficients b1 and b2, which indicate the degree of contribution to the result of the cause. This is due to the fact that there is a fairly high correlation between labor costs and sales floor area, and even managers with practical experience have doubts about the validity of this analysis result.

分析例2(実施例3より、分析結果だけを示す。)
事例名 J K L
目的変数に対する
相関係数:R 0.51 0.62 0.54
偏回帰係数: bi 0.10 0.43 0.33
これは適当に作られた事例についての分析だが、事例Jの相関係数に対する偏回帰係数
の数値が小さい。これは実施例の説明変数相互の相関係数の数値が0.5前後とやや大
きいことによるが、この程度の数値なら特別高い相関とは言えない。
Analysis Example 2 (From Example 3, only the analysis results are shown.)
Case name JKL
For the objective variable
Correlation coefficient: R 0.51 0.62 0.54
Partial regression coefficient: bi 0.10 0.43 0.33
This is an analysis of a properly made case, but the numerical value of the partial regression coefficient with respect to the correlation coefficient of case J is small. This is because the value of the correlation coefficient between the explanatory variables in the examples is a little large, around 0.5, but it cannot be said that the correlation is particularly high if the value is about this level.

回帰分析は一組(X1,X2,..,Xp)の説明変数が目的変数Yをどれだけ説明できるかの分析で、原因となるXiの数値は物理学、化学等による独立性の高いものであったとされ、一組を構成する構成要素個々のXiの貢献度を求めるものではなかったとされる。その後、社会経済現象の成果である企業の売上高とその原因である販売諸経費の関係分析に、重回帰分析が利用されるようになると、採用する諸原因は売上高と高い因果関係のある人件費、販売経費、売場面積等が選択されるようになり、売上高数値を媒介としてそれら原因間の相関係数も高く、売上高への寄与度を示すbiも常識、理論に反する異常値を示すようになった。これが”多重共線性“の問題である。Regression analysis is an analysis of how much an explanatory variable of a set (X1, X2, ..., Xp) can explain the objective variable Y, and the numerical value of Xi that causes it is highly independent by physics, chemistry, etc. It is said that the contribution of Xi of each component constituting the set was not calculated. After that, when multiple regression analysis was used to analyze the relationship between corporate sales, which is the result of socio-economic phenomena, and sales expenses, which are the causes, the causes to be adopted have a high causal relationship with sales. Personnel costs, sales costs, sales floor area, etc. have come to be selected, the correlation coefficient between these causes is high through the sales figures, and bi, which indicates the degree of contribution to sales, is an abnormal value contrary to common sense and theory. Came to show. This is the problem of "multicollinearity".

基準化変数
上記の(1)式について、説明変数Xiの数値単位が人数であったり面積mであったりすると、求められるbiの数値の意味が曖昧となるので、以後の分析で多用される相関行列の作成に変数を基準化する必要があることから、変数は全て基準化数値を使用する。
以下A1,A2はX1、X2の基準化した変数である。
基準化 :Ai=(Xi−Xi)/σi (3)
の平均 σi:Xiの標準偏差
なお、Y,X1,X2を基準化することにより、Aiの数値は平均0、標準偏差1の数値集団となる。
Standardized variable Regarding the above equation (1), if the numerical unit of the explanatory variable Xi is the number of people or the area m 2 , the meaning of the required numerical value of bi becomes ambiguous, so it is often used in the subsequent analysis. Since variables need to be standardized to create a correlation matrix, all variables use standardized numbers.
Hereinafter, A1 and A2 are standardized variables of X1 and X2.
Standardization: Ai = ( Xi-Xi ) / σi (3)
Average σi: Standard deviation of Xi By standardizing Y, X1 and X2, the numerical value of Ai becomes a numerical group with an average of 0 and a standard deviation of 1.

正規方程式の相関行列
以下の分析では主に重回帰分析の正規方程式の相関行列式を多用するので、ここでこれを導入する。

Figure 2021068456
Correlation matrix of normal equations In the following analysis, the correlation matrix equation of the normal equations of multiple regression analysis is mainly used, so this is introduced here.
Figure 2021068456

特許文献1
特許第5120820号「”多重共線性“を解消する回帰分析によって適正な標準宅地の評価を行う装置」
”多重共線性”の解決は、正規方程式の行列表示で、左右両辺の項目数値のアンバランスをいかに合理的に抑えるかにかかっている。この特許は説明変数が2変数の場合で、目的変数である地価の媒介による説明変数相互の比較的高い相関部分を次のように除去して、行列第1行の非対角要素2項を構成している。
r12=r12(1−R1R2)
これは個々の事例についてではなく、一連の事例に対してR1R2の数値を一括除去する方法である。この除去により、それなりに”多重共線性“を解消する効果を期待できるが大まかな計算である。
特許文献2
特許第5585921号「産業上の成果である売上高等と原因である販売経費等の関係分析に”多重共線性“解消の重回帰分析を適用して解明する装置」
この特許の”多重共線性“解消は[特許文献1]の除去に比して、個別の事例を対象としたより丁寧な除去であるが、基準化した説明変数そのものに対する除去であるため、事例を基準化変数の正と負の符号同調事例に分類し、それぞれについて行うもので、一貫性がなく、求めた相関行列も非対称行列になっている。
Patent Document 1
Patent No. 5120820 "A device that evaluates an appropriate standard residential land by regression analysis that eliminates" multicollinearity ""
The solution of "multicollinearity" depends on how to reasonably suppress the imbalance of the item values on both the left and right sides in the matrix display of the normal equations. In this patent, when the explanatory variables are two variables, the relatively high correlation between the explanatory variables mediated by the land price, which is the objective variable, is removed as follows, and the off-diagonal element 2 term in the first row of the matrix is removed. It is configured.
r12 = r12 (1-R1R2)
This is a method of collectively removing the numerical values of R1R2 for a series of cases, not for individual cases. This removal can be expected to have the effect of eliminating "multicollinearity" as it is, but it is a rough calculation.
Patent Document 2
Patent No. 5585921 "A device that applies multiple regression analysis to eliminate" multicollinearity "to analyze the relationship between sales, etc., which is an industrial achievement, and sales expenses, etc., which is the cause."
The elimination of "multicollinearity" in this patent is a more polite removal for individual cases than the removal in [Patent Document 1], but it is a removal for the standardized explanatory variable itself, so it is a case. Is classified into positive and negative sign tuning cases of the standardized variable, and is performed for each case. It is inconsistent and the obtained correlation matrix is also an asymmetric matrix.

The following methods are generally considered for avoiding the "multicollinearity" problem:
(1) deleting one of a pair of highly correlated explanatory variables;
(2) principal component analysis;
(3) ridge regression.
Deleting an explanatory variable is not appropriate here, because it would omit the number of employees or the sales expenses, which were selected as significant causes of the outcome, sales.
Principal component analysis is a method that summarizes p variables X1, X2, ..., Xp into m composite variables (p > m) under certain conditions; it does not yield the coefficients bi of the explanatory variables.
Ridge regression obtains the partial regression coefficients by adding a constant k to the diagonal elements of the correlation matrix; an outline is shown below.

Figure 2021068456

This has the effect of shrinking the off-diagonal correlation coefficient r12 relative to the diagonal, and "multicollinearity" is eliminated to that extent. However, there is no firm criterion for choosing the constant added to the diagonal elements; the choice is arbitrary, and the method does not appear to be used much.
Arthur E. Hoerl and Robert W. Kennard, "Ridge Regression", February 1970
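As a rough illustration (not part of the patent; the numbers are made up), the ridge adjustment can be sketched on the standardized normal equations R b = r_y, where the constant k is added to the diagonal of the correlation matrix:

```python
import numpy as np

# Illustrative sketch of ridge regression on the standardized normal
# equations R b = r_y, where R is the correlation matrix of the
# explanatory variables and r_y their correlations with Y.
# All numbers are made up for demonstration.
R = np.array([[1.0, 0.95],    # r12 = 0.95: strong multicollinearity
              [0.95, 1.0]])
r_y = np.array([0.70, 0.68])

b_ols = np.linalg.solve(R, r_y)                   # current method: unstable b values
k = 0.1                                           # ridge constant (choice is arbitrary)
b_ridge = np.linalg.solve(R + k * np.eye(2), r_y) # add k to the diagonal only

print(b_ols)    # strongly unequal coefficients from the near-singular R
print(b_ridge)  # shrunk toward more comparable, moderate values
```

The sketch shows the point made above: the b values move toward each other, but nothing in the method itself fixes the value of k.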

Whereas ridge regression eliminates "multicollinearity" indirectly, by operating on the diagonal elements to shrink the relative size of the off-diagonal values, the present invention takes the off-diagonal elements of the correlation matrix as the direct object of analysis.

Looking at the first row of equation (4), the left side is the sum of the correlation terms of the causal variables X1 and X2. Since r12 can be expected, from the way the variables were selected, to be a positive number greater than 0, (1 + r12) exceeds 1, and the left side is therefore larger than R1 (1 > R1 > 0) on the right side. The following methods can be considered for satisfying the equality of the left and right sides.
The first method is the current regression analysis approach, which could be described as forcing the two sides into equality; this is what gives rise to the "multicollinearity" problem.
The second method applies when equation (4) has two or more causal variables and overlap is expected among the causal-variable values of the terms on the left side: the overlap is removed. Even then some discrepancy between the two sides remains, but it is greatly reduced. Whether this operation is acceptable depends on whether the removal method is theoretically and practically reasonable. The first term of the first row on the left side is the product of X1 with itself, and the right side is the correlation of Y with X1 alone, so with respect to this row X2 is the variable in the subordinate position. Accordingly, if overlap between the variables is to be removed, it is appropriate to remove it from the subordinate X2, taking the principal X1 as the basis.
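The "forced equality" of the first method can be seen numerically. In this made-up example (not from the patent), all three explanatory variables correlate positively with Y, yet solving the normal equations as they stand flips one coefficient negative:

```python
import numpy as np

# Made-up numbers showing the distortion produced by the current method:
# every predictor correlates positively with Y, but the strong mutual
# correlations (0.9) force one solved coefficient below zero.
R = np.array([[1.0, 0.9, 0.9],
              [0.9, 1.0, 0.9],
              [0.9, 0.9, 1.0]])
r_y = np.array([0.80, 0.75, 0.70])

b = np.linalg.solve(R, r_y)  # normal equations solved exactly
print(b)                     # roughly [0.77, 0.27, -0.23]: b3 contradicts r(Y, X3) > 0
```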

1 Even if a constant is added to or multiplied into the standardized variable Ai, the original numerical values of Ai are treated as unchanged. See standardization formula (3).
2 ΣA1A2/n = r12.
3 Removing the overlap with the Ai values as they stand makes the calculation complicated when negative numbers are present. A constant (1.8 in this embodiment, chosen with the largest negative value of A in mind) is therefore added, so that all values in the removal calculation are positive and all the Ai values are on a unified scale.
4 For the left side (diagonal element A1, off-diagonal element A2), the portion of A2 that moves in step with A1 is removed as follows: the lower of the absolute values of A1 and A2 is removed from A2, and the residual is taken.
Removal example
A1    A2    Removed    Residual
0.3   0.2   0.2        0
0.2   0.5   0.2        0.3
Note that with the constant addition of 3 above, the sums of the A1 and A2 values in the embodiment become equal, at 18.
From this, as the following simple example shows, the off-diagonal elements of the normal equations that lie in symmetric positions about the diagonal of the square matrix take equal values.
Let the sums of the A1 and A2 values both equal 0.7.
When A1 is the diagonal element (the removal basis) and A2 the off-diagonal element to be removed:
A1    A2    Removed    Residual
0.3   0.5   0.3        0.2
0.4   0.2   0.2        0
In the opposite case:
A2    A1    Removed    Residual
0.5   0.3   0.3        0
0.2   0.4   0.2        0.2
For this reason, the off-diagonal elements in symmetric positions about the diagonal of the correlation matrix of the normal equations have equal values.
5 Since the removal residual is a part of A2, it is multiplied into A1*A2 (which, before overlap removal, is the correlation coefficient r12; see equation (4)), r12 is recomputed, a new correlation matrix after overlap removal is formed, and the bi (i = 1, 2) after overlap removal are obtained.
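One possible reading of the removal rule in 4 above can be sketched as follows; the function name and list-based layout are my own, and only the numbers of the removal example come from the text:

```python
# Sketch of the pairwise overlap removal of step 4: from each value of
# the dependent variable a2, remove up to min(|a1|, |a2|), keeping the
# residual. Values are rounded to suppress floating-point noise.
def remove_overlap(a1, a2):
    """Return (removed, residual) lists, taking a1 as the removal basis."""
    removed = [round(min(abs(x), abs(y)), 10) for x, y in zip(a1, a2)]
    residual = [round(y - r, 10) for y, r in zip(a2, removed)]
    return removed, residual

# The removal example from the text:
removed, residual = remove_overlap([0.3, 0.2], [0.2, 0.5])
print(removed)   # [0.2, 0.2]
print(residual)  # [0.0, 0.3]

# The symmetry example (sums of A1 and A2 both 0.7): the total residual
# is the same whichever variable serves as the basis.
_, s1 = remove_overlap([0.3, 0.4], [0.5, 0.2])
_, s2 = remove_overlap([0.5, 0.2], [0.3, 0.4])
print(sum(s1) == sum(s2))  # True
```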

To confirm the validity of the obtained bi, the relative ratios of the bi are computed with the highest original correlation coefficient of an explanatory variable against the objective variable Y set to 100. These ratios come out close to the original correlation values, showing that "multicollinearity" and the like have been eliminated.

Implementation procedure

1 The objective variable Y and the explanatory variables Xi of the normal equations of the multiple regression analysis are standardized, both to suppress variation due to the measurement units of the explanatory variables and to form the correlation matrix of the normal equations.
2 The property that adding a constant to, or multiplying, the standardized variables does not change their values is used.
3 A constant is added to the standardized variables so that the values in each row of the normal equations, and the overlap removal, produce no negative numbers.
4 The overlap of the dependent variable in the off-diagonal element with the principal variable on the diagonal is removed by applying the calculation of [0012].
5 Since the resulting residual is a part of the dependent variable, that is, of an off-diagonal element of the matrix [0012], a new determinant is formed with it as the correlation coefficient of the off-diagonal element of the normal equations.
6 From the new matrix of the normal equations produced by the above operations, the bi indicating the degree of contribution to the objective variable Y are obtained.
7 To verify whether the above calculation method for eliminating "multicollinearity" and the like is appropriate, its validity is checked by adding explanatory variables one at a time to the same example. The results of applying the elimination method in the case of four explanatory variables are close to the relative-ratio differences of the correlation coefficients of the explanatory variables against the original objective variable, and are broadly reasonable.
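The steps above can be sketched end to end as follows. This is only one reading of the procedure: the data, the shift constant, and the way the residual share is folded back into r12 in step 5 are illustrative assumptions, not the patent's embodiment values.

```python
import numpy as np

# End-to-end sketch (one interpretation, two explanatory variables).
def standardize(x):
    return (x - x.mean()) / x.std()            # step 1: A = (X - mean) / sigma

def adjusted_r(a1, a2, shift):
    """Overlap-removed correlation of a1 (basis) with a2 (dependent)."""
    p1, p2 = a1 + shift, a2 + shift            # steps 2-3: shift so all values are positive
    removed = np.minimum(np.abs(p1), np.abs(p2))
    residual = p2 - removed                    # step 4: remove overlap from the dependent side
    share = residual.sum() / p2.sum()          # residual as a fraction of the dependent variable
    return share * np.mean(a1 * a2)            # step 5: scale the original r12 by that fraction

rng = np.random.default_rng(0)
x1 = rng.normal(size=30)
x2 = 0.8 * x1 + 0.3 * rng.normal(size=30)      # deliberately collinear pair
y = x1 + x2 + 0.2 * rng.normal(size=30)

a1, a2, ay = standardize(x1), standardize(x2), standardize(y)
shift = abs(min(a1.min(), a2.min())) + 0.1     # large enough to clear all negatives

r12 = adjusted_r(a1, a2, shift)                # overlap-removed off-diagonal element
R = np.array([[1.0, r12], [r12, 1.0]])         # step 5: new correlation matrix
r_y = np.array([np.mean(a1 * ay), np.mean(a2 * ay)])
b = np.linalg.solve(R, r_y)                    # step 6: contribution bi of each Xi to Y
print(b)
```

Because the overlap removal shrinks the off-diagonal element, the solved b values stay closer to the raw correlations with Y than the unadjusted solution would.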

Example 1
The following objective variable Y and explanatory variables Xh, Xi, Xj, Xk, and Xl are arbitrarily set values. To simplify the calculation, the correlation coefficients between the variables are computed first.

Figure 2021068456

Example 2: the cases of 2, 3, and 4 variables

Figure 2021068456
Figure 2021068456
Figure 2021068456
Figure 2021068456

In all of the computed cases, with the 2 adopted variables and with the added third and fourth variables, the bi values obtained by the current analysis method are distorted, whereas the overlap-removal analysis gives results relatively close to the relative ratios of the R values in every case.

Example 3
A regression analysis obtaining bi for the three variables Xj, Xk, and Xl, whose correlation coefficients with Y are at the 0.5 level.

Figure 2021068456

In the computed results the b values of the current analysis method are distorted, while the overlap-removal calculation comes close to the R values.

Claims (1)

A calculation technique for eliminating "multicollinearity" and the like, in which: with sales or the like as the objective variable Y and sales expenses or the like as the explanatory variables Xi, the correlation determinant of the normal equations of the multiple regression analysis is formed in standardized variables (Ai = (Xi − X̄i)/σi, where X̄i is the mean and σi the standard deviation, and likewise for Y); a constant is added to the standardized variables so that unifying the values and removing the overlap produce no negative numbers; taking the diagonal elements of the matrix as basis variables and the off-diagonal elements as dependent variables, the overlap with the basis variable is removed from the dependent variable; and the residual is used anew as the off-diagonal element of the correlation determinant of the normal equations, from which the partial regression coefficients bi indicating the contribution to Y are obtained.
JP2020183948A 2020-10-15 2020-10-15 Calculation technique for eliminating "multicollinearity" or the like in regression analysis, and obtaining partial regression coefficient indicating contribution to proper objective variable of explanatory variable, as management material Pending JP2021068456A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020183948A JP2021068456A (en) 2020-10-15 2020-10-15 Calculation technique for eliminating "multicollinearity" or the like in regression analysis, and obtaining partial regression coefficient indicating contribution to proper objective variable of explanatory variable, as management material

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2020183948A JP2021068456A (en) 2020-10-15 2020-10-15 Calculation technique for eliminating "multicollinearity" or the like in regression analysis, and obtaining partial regression coefficient indicating contribution to proper objective variable of explanatory variable, as management material

Publications (2)

Publication Number Publication Date
JP2021068456A true JP2021068456A (en) 2021-04-30
JP2021068456A5 JP2021068456A5 (en) 2021-07-29

Family

ID=75637388

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2020183948A Pending JP2021068456A (en) 2020-10-15 2020-10-15 Calculation technique for eliminating "multicollinearity" or the like in regression analysis, and obtaining partial regression coefficient indicating contribution to proper objective variable of explanatory variable, as management material

Country Status (1)

Country Link
JP (1) JP2021068456A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115545216A (en) * 2022-10-19 2022-12-30 上海零数众合信息科技有限公司 Service index prediction method, device, equipment and storage medium
CN115545216B (en) * 2022-10-19 2023-06-30 上海零数众合信息科技有限公司 Service index prediction method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
Weil Are mandated health and safety committees substitutes for or supplements to labor unions?
Liao et al. Economic production quantity model for randomly failing production process with minimal repair and imperfect maintenance
US8015137B2 (en) Determining the degree of relevance of alerts in an entity resolution system over alert disposition lifecycle
US8250637B2 (en) Determining the degree of relevance of duplicate alerts in an entity resolution system
US20140324519A1 (en) Operational Risk Decision-Making Framework
US20100161634A1 (en) Best-value determination rules for an entity resolution system
US20090271348A1 (en) Determining the degree of relevance of alerts in an entity resolution system
Li et al. Designing lean processes with improved service quality: An application in financial services
Miller et al. Do auditors assess inherent risk as if there are no controls?
JP2021068456A (en) Calculation technique for eliminating "multicollinearity" or the like in regression analysis, and obtaining partial regression coefficient indicating contribution to proper objective variable of explanatory variable, as management material
US20070179831A1 (en) Systems and methods of managing assignments
JP2021068456A5 (en)
UcuNugraha Implementation of ISO 31000 for information technology risk management in the government environment
KR100891345B1 (en) Information security managment system supporting inter-mapping between each different information security index and method thereof
US20070194920A1 (en) Focused alarm rationalization method
JP6062836B2 (en) Expense apportioning system and method
Lai et al. Development of a failure mode and effects analysis based risk assessment tool for information security
Sivaretinamohan et al. Behavioural Intention towards adoption of Robotic Accounting for a profitable leading digital transformation
Mehta et al. Applications of combinatorial testing methods for breakthrough results in software testing
Daugherty et al. Offshoring audit tasks and jurors' evaluations of damage awards against auditors
Nitu et al. ISO 9004 and risk management in practice
Umehara et al. Proposal for combinatorial optimization technology in consideration of the dynamic characteristic of IT risks
Aksentijević et al. Application of social network analysis to port community systems
WO2018090055A1 (en) System and method for software adoption
Muhammad et al. Development of quality improvement matrix: An integrated tools for quality improvement

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20201015

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20201126

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20201124

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20210222

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20210322

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20210906

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20211026

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20220920