WO2015145979A1 - Price estimation device, price estimation method, and recording medium - Google Patents
Price estimation device, price estimation method, and recording medium
- Publication number
- WO2015145979A1 (PCT/JP2015/001024)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- price
- information
- component
- hierarchical
- model
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
- G06Q30/0206—Price or cost determination based on market factors
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computing arrangements based on specific mathematical models
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
- G06Q30/0202—Market predictions or forecasting for commercial activities
Definitions
- the present invention relates to a price estimation device, a price estimation method, a recording medium, and the like.
- prices related to objects such as used buildings, used cars, and used equipment vary depending on the service life, whether there is a failure, the frequency of maintenance, the degree of wear, and the like.
- in general, the correlation between the values of such factors and the price is obtained by analyzing statistical data in which values of factors that can affect the price of an object, such as the service life, are associated with the price. Based on the result of this analysis, the price of a given object is then estimated.
- Patent Document 1 discloses a residual value prediction system that predicts a residual value related to an object.
- the residual value prediction system has a used price database that stores an elapsed time related to an object and a used distribution price related to the object (or a ratio of the used distribution price to a new price).
- the residual value prediction system obtains a function that associates the elapsed time with the used distribution price based on the used price database.
- the residual value prediction system estimates a used distribution price for the new object by applying the function to the elapsed time for the new object.
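- for illustration only (not taken from Patent Document 1), the kind of curve fitting such a residual value prediction system might perform can be pictured as fitting a single depreciation function to (elapsed time, used price) pairs and evaluating it at a new elapsed time; the exponential form echoes the "exponential function or the like" mentioned later in the text, but all data and names below are assumptions:

```python
# Hypothetical sketch: fit an exponential depreciation curve to (elapsed_time, used_price)
# pairs, then apply it to a new object's elapsed time. The model form is an assumption.
import numpy as np
from scipy.optimize import curve_fit

def depreciation(t, a, b):
    # assumed form: the used price falls exponentially with elapsed time t
    return a * np.exp(-b * t)

elapsed_years = np.array([0.5, 1.0, 2.0, 3.0, 5.0])      # toy training data
used_prices   = np.array([90.0, 82.0, 65.0, 52.0, 33.0])  # e.g. in units of 10,000 yen

params, _ = curve_fit(depreciation, elapsed_years, used_prices, p0=(100.0, 0.2))
print("estimated used price at 4 years:", depreciation(4.0, *params))
```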
- Non-Patent Document 1 discloses, as an example of a prediction technique, a method of determining the type of observation probability by approximating the complete marginal likelihood function for a mixture model, which is a representative example of a hidden variable model, and maximizing its lower bound (lower limit).
- the residual value prediction system disclosed in Patent Document 1 does not necessarily have high prediction accuracy.
- an object of the present invention is to provide a price estimation device, a price estimation method, a recording medium, and the like that can predict a price with high accuracy.
- the price estimation device includes: prediction data input means for inputting prediction data, which is one or more explanatory variables that can affect a price; component determination means for determining the component used for price prediction based on the prediction data, a hierarchical hidden structure, and a gate function model, the hierarchical hidden structure being a structure in which hidden variables are represented by a hierarchical structure having one or more nodes arranged in each hierarchy and paths between nodes arranged in a first hierarchy and nodes arranged in a lower second hierarchy, with components representing probability models arranged at the nodes in the lowest layer of the hierarchical structure, and the gate function model being the basis for determining the path between the nodes constituting the hierarchical hidden structure when the component is determined; and price prediction means for predicting the price based on the component determined by the component determination means and the prediction data.
- the price estimation method is as follows: an information processing apparatus inputs prediction data, which is one or more explanatory variables that can affect a price; determines the component to be used for price prediction based on the prediction data, a hierarchical hidden structure in which hidden variables are represented by a hierarchical structure having one or more nodes arranged in each hierarchy and paths between nodes arranged in a first hierarchy and nodes arranged in a lower second hierarchy, with components representing probability models arranged at the nodes in the lowest layer of the hierarchical structure, and a gate function model that is the basis for determining the path between the nodes constituting the hierarchical hidden structure when the component is determined; and predicts the price based on the determined component and the prediction data.
- this object is also achieved by such a price prediction program and by a computer-readable recording medium storing the program.
- the price can be predicted with higher accuracy.
- the following drawings each relate to at least one embodiment of the present invention:
- a block diagram showing a configuration example of the gate function model optimization processing unit;
- a flowchart showing an operation example of the gate function model optimization processing unit;
- a block diagram showing the basic configuration of the hierarchical hidden variable model estimation device;
- a block diagram showing the basic configuration of the price estimation device;
- a schematic block diagram showing the configuration of a computer;
- a block diagram showing the configuration of the estimation device according to the fourth embodiment of the present invention;
- a diagram conceptually illustrating an example of a first information set;
- a block diagram showing the configuration of the price estimation device according to the fifth embodiment of the present invention;
- a diagram showing an example of the gate function model and components;
- a block diagram showing the configuration of the price estimation device according to the sixth embodiment of the present invention;
- a flowchart showing the flow of processing in the price estimation device according to the sixth embodiment;
- a block diagram showing the configuration of the price estimation device according to the seventh embodiment of the present invention;
- a flowchart showing the flow of processing in the price estimation device according to the seventh embodiment.
- even if the method described in Non-Patent Document 1 is applied to price prediction, there is a problem that the model selection problem for models including hierarchical hidden variables cannot be solved.
- this is because Non-Patent Document 1 does not take hierarchical hidden variables into account, so it is obvious that a corresponding calculation procedure cannot be constructed.
- it is also because the method described in Non-Patent Document 1 relies on a strong assumption that does not hold when there are hierarchical hidden variables; if the method were simply applied to price prediction, its theoretical validity would therefore be lost.
- Patent Document 1 has a problem that the prediction accuracy is not always high.
- the residual value prediction system obtains a function by applying an exponential function or the like to each used car classified according to color or the like.
- the processing procedure adopted by the residual value prediction system is a predetermined procedure and is not necessarily an optimal procedure for predicting a used distribution price. Therefore, the function obtained by the residual value prediction system does not necessarily sufficiently explain the used distribution price.
- further, the residual value prediction system disclosed in Patent Literature 1 cannot automatically find an optimal classification, nor can it assign an optimal expression to a classification that has been found.
- the residual value prediction system has a problem that the accuracy deteriorates when the optimum classification method is different for each vehicle type or when the optimum expression is different for each classified object.
- when the residual value prediction system is used, an optimal classification must be found by trial and error for every classification in order to reduce this accuracy degradation. However, finding an optimal classification by trial and error requires a great deal of time and effort. That is, the residual value prediction system also has the problem that finding the optimum classification takes a great deal of time.
- the present applicant has found such a problem and derived means for solving the problem.
- the prices to be predicted are, for example, the prices of used buildings, used cars, used equipment, used game machines, and used clothes. The price to be predicted may also be, for example, the purchase price paid by an intermediary that mediates buying and selling, or the sales price when such an intermediary sells.
- the price related to the used device is predicted.
- the target to be predicted is not limited to the price of used equipment.
- the learning database contains multiple sets of data on used equipment and prices.
- a hierarchical hidden variable model represents a probability model in which hidden variables have a hierarchical structure (for example, a tree structure). Components that are probabilistic models are assigned to the nodes in the lowest layer of the hierarchical hidden variable model.
- a gate function model (a reference function for selecting a node according to input information) is assigned to each node other than the nodes in the lowest layer.
- the price estimation device will be described with reference to a hierarchical hidden variable model having two layers as an example.
- the hierarchical structure is a tree structure.
- the hierarchical structure does not necessarily have to be a tree structure.
- since the hierarchical structure is a tree structure, the route from the root node to a certain node is uniquely determined.
- a route (link) from a root node to a certain node is referred to as a “route”.
- a path hidden variable is a hidden variable determined for each route obtained by tracing the hidden variables along that route. For example, the path hidden variable in the lowest layer is the path hidden variable determined for each route from the root node to a node in the lowest layer.
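- as a concrete mental model only (not taken from the patent text), such a depth-2 hierarchical hidden structure can be held in memory as a tree whose internal nodes carry gate parameters and whose lowest-layer nodes carry components; the class and field names below are illustrative assumptions:

```python
# Illustrative sketch (names assumed): a depth-2 hierarchical hidden structure as a tree
# whose internal nodes hold gate parameters and whose leaves hold components.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Component:              # node in the lowest layer
    weights: List[float]      # one weight (parameter) per explanatory variable

@dataclass
class GateNode:               # internal node: decides which child a sample branches to
    dim: int                  # index of the explanatory variable examined by the gate
    threshold: float          # branching threshold
    children: list = field(default_factory=list)   # GateNode or Component instances

# K1 = 2 nodes in the first layer, K2 = 2 children each => 4 components in the lowest layer
root = GateNode(dim=0, threshold=5.0, children=[
    GateNode(dim=1, threshold=2.0, children=[Component([0.1, 0.2]), Component([0.3, 0.1])]),
    GateNode(dim=1, threshold=3.0, children=[Component([0.5, 0.0]), Component([0.2, 0.4])]),
])
print(len(root.children))  # 2 first-layer nodes
```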
- the data string xn may be referred to as an observation variable.
- for the observation variable x^n, a branch hidden variable z_i^n in the first layer and a path hidden variable z_ij^n in the lowest layer are defined.
- z_i^n = 1 represents that x^n branches to the i-th node in the first layer, and z_i^n = 0 represents that it does not; z_ij^n = 1 represents that x^n input to the i-th node in the first layer branches to the j-th node in the second layer, and z_ij^n = 0 represents that it does not.
- the constraints Σ_i z_i^n = 1 and Σ_i Σ_j z_ij^n = 1 hold.
- the combination of x and the representative value z of the path hidden variable z_ij^n in the lowest layer is called a "complete variable".
- x alone is called an "incomplete variable".
- the joint distribution of a hierarchical hidden variable model having a depth of 2 for the complete variable is expressed by Equation 1.
- here, the representative value of z_i^n is written z^{1st,n}, and the representative value of z_ij^n is written z^{2nd,n}.
- the variation distribution for the branch hidden variable z i n in the first layer is q (z i n )
- the variation distribution for the path hidden variable z ij n in the lowest layer is q (z ij n ).
- K 1 represents the number of nodes in the first layer
- K 2 represents the number of nodes branched from each node in the first layer.
- the number of components in the lowest layer is represented by K 1 ⁇ K 2 .
- ⁇ ( ⁇ , ⁇ 1 ,..., ⁇ K1 , ⁇ 1 ,..., ⁇ K1 ⁇ K2 ) represents a model parameter.
- ⁇ represents the branch parameter of the root node.
- ⁇ k represents a branch parameter of the k-th node in the first layer.
- ⁇ k represents an observation parameter for the k-th component.
- in the following, when a specific example is used for description, a hierarchical hidden variable model having a depth of 2 will be used as the example.
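- for orientation, the following is a hedged reconstruction of what a depth-2 joint distribution such as Equation 1 typically looks like, assembled from the definitions of β, β_k, φ_k, z_i^n, and z_ij^n above; the exact notation of the original Equation 1 may differ:

```latex
% Hedged reconstruction of a depth-2 joint distribution in the spirit of Equation 1;
% beta is the root branch parameter, beta_i the first-layer branch parameters, and
% phi_ij the observation parameters of the components in the lowest layer.
P(x^{N}, z^{N} \mid \theta)
  = \prod_{n=1}^{N} \prod_{i=1}^{K_1}
      \Bigl[ P(z_i^{n} \mid \beta) \Bigr]^{z_i^{n}}
      \prod_{j=1}^{K_2}
      \Bigl[ P\bigl(z_{j \mid i}^{n} \mid \beta_i\bigr)\, P\bigl(x^{n} \mid \phi_{ij}\bigr) \Bigr]^{z_{ij}^{n}}
```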
- the hierarchical hidden variable model according to at least one embodiment is not limited to the hierarchical hidden variable model having a depth of 2, and is a hierarchical hidden variable model having a depth of 1 or 3 or more. There may be.
- in such cases, Equation 1 and Equations 2 to 4 described later need only be derived in the same manner, and the estimation device can be realized with the same configuration.
- in the following, the distribution for the case where the target variable is X will be described.
- however, the present invention can also be applied to the case where the observation distribution is a conditional model P(Y | X).
- in Non-Patent Document 1, a general mixture model is assumed for the probability distribution of the hidden variable that serves as an indicator of the component, and the optimization criterion is derived as shown in Equation 10 of Non-Patent Document 1. However, as can be seen from the fact that the Fisher information matrix is given in the form of Equation 6 of Non-Patent Document 1, the method described in Non-Patent Document 1 assumes that the probability distribution of the hidden variables serving as component indicators depends only on the mixture ratio of the mixture model. Therefore, switching of components according to the input cannot be realized, and this optimization criterion is not appropriate.
- FIG. 1 is a block diagram showing an example of the configuration of the price prediction system according to the first embodiment of the present invention.
- the price prediction system 10 includes a hierarchical hidden variable model estimation device 100, a learning database 300, a model database 500, and a price estimation device 700.
- the price prediction system 10 generates a model used for price prediction based on the learning database 300, and performs price prediction using the model.
- the hierarchical hidden variable model estimation apparatus 100 creates a model for estimating a price based on the data in the learning database 300 and stores the created model in the model database 500.
- 2A to 2D are diagrams illustrating examples of information stored in the learning database 300 according to at least one embodiment of the present invention.
- the learning database 300 stores price information in which a price is associated with factors that may affect the price. As illustrated in FIG. 2A, the price information stores a price, a purchase time, a price measurement time, and the like in association with a device identifier (ID).
- the learning database 300 stores device information in which data related to the device is stored. As shown in FIG. 2B, the device information stores a sale start time, a transportation cost, a service life, a color, a size, a scratch state, a weight, and the like in association with the device ID.
- the learning database 300 stores device configuration information in which data related to attached devices attached to the device is stored. As shown in FIG. 2C, the device configuration information stores a device ID and an attached device ID attached to the device in association with each other.
- the learning database 300 stores accessory device information in which data related to the accessory device is stored. As shown in FIG. 2D, the accessory device information stores a service life, purchase time, next inspection time, and the like in association with the accessory device ID.
- Learning data may be created by combining values included in the learning database 300.
- the learning data may be created by applying a calculation to the values included in the learning database 300.
- the learning data may be created by combining the two operations described above.
- the information included in the learning database is not limited to the example described above.
- the learning database 300 may include, for example, information such as displacement, equipment information, mileage, manufacturer, year of release, number of months of vehicle inspection, model, vehicle type, or grade.
- the learning database 300 may include items other than the items described above, and does not necessarily include all the items described above.
- the learning database 300 may include, for example, information such as the distance from the station, the total floor area, the number of floors, the floor, the distance from a park, the distance from a school, the distance from a supermarket, whether the bath and toilet are separate, whether there is an auto-lock, whether there is an elevator, the size of storage, or the floor plan.
- the learning database 300 may include items other than the items described above, and does not necessarily include all the items described above.
- the model database 500 stores a model used when the price estimated by the hierarchical hidden variable model estimation apparatus 100 is calculated.
- the model database 500 is configured by a non-transitory tangible medium such as a hard disk drive or a solid state drive.
- the price estimation apparatus 700 receives information related to the price related to the object, and predicts the price based on the received information and the model stored in the model database 500.
- FIG. 3 is a block diagram illustrating a configuration example of a hierarchical hidden variable model estimation apparatus according to at least one embodiment of the present invention.
- the hierarchical hidden variable model estimation apparatus 100 includes a data input device 101, a hierarchical hidden structure setting unit 102, an initialization processing unit 103, a hierarchical hidden variable variation probability calculation processing unit 104, and a component optimization processing unit 105. Furthermore, the hierarchical hidden variable model estimation device 100 includes a gate function model optimization processing unit 106, an optimality determination processing unit 107, an optimal model selection processing unit 108, and a model estimation result output device 109.
- upon receiving the input data 111, the hierarchical hidden variable model estimation apparatus 100 optimizes the hierarchical hidden structure and the types of observation probability for the input data 111. Next, the hierarchical hidden variable model estimation apparatus 100 outputs the optimized result as a model estimation result 112 and records the model estimation result 112 in the model database 500.
- the input data 111 is an example of learning data.
- FIG. 4 is a block diagram illustrating a configuration example of the calculation processing unit 104 of the hierarchical hidden variable variation probability according to at least one embodiment of the present invention.
- the hierarchical hidden variable variation probability calculation processing unit 104 includes a lowest-layer path hidden variable variation probability calculation processing unit 104-1, a hierarchy setting unit 104-2, an upper-layer path hidden variable variation probability calculation processing unit 104-3, and a hierarchy calculation end determination processing unit 104-4.
- the hierarchical hidden variable variation probability calculation processing unit 104 receives the input data 111 and the estimation model 104-5, and outputs the hierarchical hidden variable variation probability 104-6.
- the component in the present embodiment is a value indicating a weight (parameter) related to each explanatory variable.
- the price estimation apparatus 700 can obtain the objective variable by calculating the sum of the explanatory variables multiplied by the weight indicated by the component.
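- a minimal sketch of this weighted-sum prediction, with made-up explanatory variable names and weights:

```python
# Minimal sketch: a component is a weight per explanatory variable, and the predicted
# price (objective variable) is the sum of explanatory variables multiplied by the weights.
explanatory = {"service_life_years": 3.0, "mileage_10k_km": 4.5, "bias": 1.0}
component   = {"service_life_years": -8.0, "mileage_10k_km": -2.5, "bias": 120.0}

predicted_price = sum(component[name] * value for name, value in explanatory.items())
print(predicted_price)  # 120.0 - 8.0*3.0 - 2.5*4.5 = 84.75
```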
- FIG. 5 is a block diagram showing a configuration example of the gate function model optimization processing unit 106 according to at least one embodiment of the present invention.
- the gate function model optimization processing unit 106 includes a branch node information acquisition unit 106-1, a branch node selection processing unit 106-2, a branch parameter optimization processing unit 106-3, and an all-branch-node optimization end determination processing unit 106-4.
- the gate function model optimization processing unit 106 receives the input data 111, the hierarchical hidden variable variation probability 104-6 calculated by the hierarchical hidden variable variation probability calculation processing unit 104, and the estimation model 104-5 estimated by the component optimization processing unit 105, which will be described later.
- in response to receiving these three inputs, the gate function model optimization processing unit 106 outputs the gate function model 106-6. A detailed description of the gate function model optimization processing unit 106 will be given later.
- the gate function model in the present embodiment is a function that determines whether information included in the input data 111 satisfies a predetermined condition.
- the gate function model is provided corresponding to the internal node of the hierarchical hidden structure.
- the internal node represents a node other than the node arranged in the lowest layer.
- the data input device 101 is a device for inputting input data 111.
- the data input device 101 extracts an objective variable indicating a price based on the data recorded in the price information in the learning database 300.
- the data input device 101 generates an explanatory variable based on data recorded in price information, device information, device configuration information, attached device information, etc. in the learning database 300. That is, the data input device 101 generates, for each objective variable, one or more explanatory variables that are information that can affect the objective variable. Then, the data input device 101 inputs a plurality of combinations of objective variables and explanatory variables as input data 111. When the input data 111 is input, the data input device 101 also inputs parameters necessary for model estimation, such as the type of observation probability and the number of components. In the present embodiment, the data input device 101 is an example of a learning information input unit.
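- an illustrative sketch (table names, fields, and values are assumptions) of how such (objective variable, explanatory variables) pairs might be assembled by joining the price information and device information on the device ID:

```python
# Hypothetical sketch: build (objective, explanatory) training pairs by joining the
# price-information and device-information records on the device ID.
price_info  = [{"device_id": "D1", "price": 84.0, "purchase_time": 2010},
               {"device_id": "D2", "price": 51.5, "purchase_time": 2008}]
device_info = {"D1": {"service_life": 10, "weight": 120.0},
               "D2": {"service_life": 8,  "weight": 95.0}}

input_data_111 = []
for row in price_info:
    features = dict(device_info[row["device_id"]])    # explanatory variables
    features["purchase_time"] = row["purchase_time"]
    input_data_111.append((row["price"], features))   # (objective variable, explanatory variables)

print(input_data_111[0])
```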
- the hierarchical hidden structure setting unit 102 selects, from the input types of observation probability and numbers of components, a hierarchical hidden variable model structure that is a candidate for optimization, and sets the selected structure as the object to be optimized.
- the hidden structure used in this embodiment is, for example, a tree structure. In the following, it is assumed that the set number of components is represented as C, and the mathematical formula used in the description is for a hierarchical hidden variable model having a depth of 2.
- the hierarchical hidden structure setting unit 102 may store the structure of the selected hierarchical hidden variable model in a memory.
- for example, the hierarchical hidden structure setting unit 102 selects a hierarchical hidden structure having two nodes in the first layer and four nodes in the second layer (in this embodiment, the lowest layer).
- the initialization processing unit 103 performs an initialization process for estimating a hierarchical hidden variable model.
- the initialization processing unit 103 can execute initialization processing by various methods. For example, the initialization processing unit 103 may set the type of observation probability at random for each component, and set the parameter of each observation probability at random according to the set type. Further, the initialization processing unit 103 may set the path variation probability at the lowest layer of the hierarchical hidden variable at random.
- the hierarchical hidden variable variation probability calculation processing unit 104 calculates the variation probability of the path hidden variable for each layer.
- the parameter θ is calculated by the initialization processing unit 103, the component optimization processing unit 105, the gate function model optimization processing unit 106, and the like. The hierarchical hidden variable variation probability calculation processing unit 104 therefore calculates the variation probability based on that value.
- specifically, the hierarchical hidden variable variation probability calculation processing unit 104 calculates the variation probability by Laplace-approximating the marginal log-likelihood function with respect to an estimator for the complete variables (for example, a maximum likelihood estimator or a maximum a posteriori probability estimator) and maximizing it.
- the variation probability calculated in this way is referred to as an optimization criterion A.
- log represents a logarithmic function.
- the base of the logarithmic function is, for example, the Napier number. The same applies to the following expressions.
- in Equation 2, the equality is established by maximizing the variation probability q(z^N) of the path hidden variable in the lowest layer.
- when the marginalized likelihood of the complete variable in the numerator is Laplace-approximated using the maximum likelihood estimator for the complete variable, the approximate expression of the marginalized log-likelihood function shown in Equation 3 is obtained.
- in Equation 3, the superscript bar represents the maximum likelihood estimator for the complete variable, and D* represents the dimension of the parameter indicated by the subscript *.
- a lower bound of Equation 3 is then calculated as shown in Equation 4.
- the variation distribution q ′ of the branch hidden variable in the first layer and the variation distribution q ′′ of the path hidden variable in the lowermost layer are obtained by maximizing Equation 4 for each variation distribution.
- the superscript (t) represents the t-th iteration of the repeated computation performed by the hierarchical hidden variable variation probability calculation processing unit 104, the component optimization processing unit 105, the gate function model optimization processing unit 106, and the optimality determination processing unit 107.
- the lowest-layer path hidden variable variation probability calculation processing unit 104-1 receives the input data 111 and the estimation model 104-5, and calculates the variation probability q(z^N) of the hidden variables in the lowest layer.
- the hierarchy setting unit 104-2 sets that the object whose variation probability is to be calculated is the lowest layer.
- the variation probability calculation unit 104-1 for the path hidden variable in the lowest layer calculates the variation probability of each estimation model 104-5 for the combination of the objective variable and the explanatory variable of the input data 111.
- the variation probability is calculated by comparing the value obtained by substituting the explanatory variable of the input data 111 into the estimation model 104-5 and the value of the objective variable of the input data 111.
- the upper-layer path hidden variable variation probability calculation processing unit 104-3 calculates the variation probability of the path hidden variables in the upper layers. Specifically, it calculates the sum of the variation probabilities of the path hidden variables that have the same branch node as a parent, and sets that sum as the variation probability of the path hidden variable in the layer one level above.
- the hierarchy calculation end determination processing unit 104-4 determines whether or not the layer for which the variation probability is to be calculated still exists in the upper layer. When it is determined that an upper layer exists, the hierarchy setting unit 104-2 sets one upper layer as a target for which the variation probability is to be calculated. Thereafter, the calculation processing unit 104-3 for the variation probability of the path hidden variable in the upper layer and the determination processing unit 104-4 for the completion of the hierarchy calculation repeat the above-described processing. On the other hand, when it is determined that there is no higher layer, the hierarchy calculation end determination processing unit 104-4 determines that the variation probability of the route hidden variable in all the layers has been calculated.
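- a toy sketch of this bottom-up aggregation for a single sample, assuming lowest-layer variation probabilities indexed by (first-layer node i, second-layer node j):

```python
# Toy sketch: lowest-layer path variational probabilities q(z_ij) for one sample,
# aggregated bottom-up to upper-layer probabilities q(z_i) by summing over children j.
q_lowest = {(1, 1): 0.05, (1, 2): 0.15,   # children of first-layer node 1
            (2, 1): 0.60, (2, 2): 0.20}   # children of first-layer node 2

q_upper = {}
for (i, j), prob in q_lowest.items():
    q_upper[i] = q_upper.get(i, 0.0) + prob

print(q_upper)  # {1: 0.2, 2: 0.8}
```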
- the component optimization processing unit 105 optimizes each component model (parameter ⁇ and its type S) with respect to Equation 4, and outputs an optimized estimation model 104-5.
- at this time, the component optimization processing unit 105 fixes q and q′′ to the lowest-layer path hidden variable variation probability q(t) calculated by the hierarchical hidden variable variation probability calculation processing unit 104, and fixes q′ to the upper-layer path hidden variable variation probability shown in Expression A.
- the component optimization processing unit 105 calculates a model that maximizes the value of G shown in Equation 4.
- S_1, ..., S_K1×K2 represent the types of observation probability corresponding to φ_1, ..., φ_K1×K2.
- candidates for S_1 to S_K1×K2 are, for example, a normal distribution, a lognormal distribution, or an exponential distribution, or curves such as a zeroth-order, first-order, second-order, or third-order curve.
- in Equation 4, the optimization function can be decomposed for each component. Therefore, S_1 to S_K1×K2 and the parameters φ_1 to φ_K1×K2 can be optimized separately, without considering the combinations of component types (for example, which of the candidate types is assigned to each of S_1 to S_K1×K2). Being able to optimize in this way is important in this process, because it avoids a combinatorial explosion when optimizing the component types.
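- a sketch of this per-component decomposition, with a dummy scoring function standing in for the actual fit of each candidate observation-probability type:

```python
# Sketch: because the lower bound (Equation 4) decomposes per component, the observation
# type of each component can be chosen independently instead of searching over all
# combinations of types. fit_and_score is a dummy stand-in for the real optimization.
CANDIDATE_TYPES = ["normal", "lognormal", "exponential"]

def fit_and_score(component_data, obs_type):
    # placeholder: would fit parameters of this type and return its contribution to G
    return len(component_data) - 0.1 * len(obs_type)

def choose_types(per_component_data):
    chosen = []
    for data in per_component_data:          # independent decision for each component
        best_type = max(CANDIDATE_TYPES, key=lambda t: fit_and_score(data, t))
        chosen.append(best_type)
    return chosen

print(choose_types([[1.0, 2.0], [3.0], [0.5, 0.7, 0.9]]))  # 3 components, 3 choices
```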
- the branch node information acquisition unit 106-1 extracts the branch node list using the estimation model 104-5 estimated by the component optimization processing unit 105.
- the branch node selection processing unit 106-2 selects one branch node from the extracted list of branch nodes.
- the selected node may be referred to as a selected node.
- the branch parameter optimization processing unit 106-3 uses the input data 111 and the variation probability of the hidden variable regarding the selected node obtained from the variation probability 104-6 of the hierarchical hidden variable to determine the branch parameter of the selected node. Optimize. Note that the branch parameter of the selected node corresponds to the gate function model described above.
- the all-branch-node optimization end determination processing unit 106-4 determines whether all the branch nodes extracted by the branch node information acquisition unit 106-1 have been optimized. When all the branch nodes have been optimized, the gate function model optimization processing unit 106 ends the processing here. On the other hand, if a branch node that has not been optimized remains, the processing by the branch node selection processing unit 106-2, the branch parameter optimization processing unit 106-3, and the all-branch-node optimization end determination processing unit 106-4 is performed again in the same manner.
- a gate function based on the Bernoulli distribution may be referred to as a Bernoulli type gate function.
- the d-th dimension of x is represented as xd .
- the probability of branching to the lower left of the binary tree when this value does not exceed a certain threshold value w is expressed as g ⁇ .
- the probability of branching to the lower left of the binary tree when the threshold value w is exceeded is represented as g + .
- the branch parameter optimization processing unit 106-3 optimizes the optimization parameters d, w, g ⁇ , and g + based on the Bernoulli distribution. This is different from the optimization based on the logit function described in Non-Patent Document 1, and each parameter has an analytical solution, so that higher-speed optimization is possible.
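- a sketch of such a Bernoulli-type gate decision, assuming x is indexable and g_minus / g_plus are the lower-left branching probabilities below and above the threshold:

```python
# Sketch of a Bernoulli-type gate function (names assumed): dimension d of x is compared
# with threshold w; g_minus / g_plus is the probability of branching to the lower-left
# child when x[d] does not exceed / exceeds w.
import random

def branches_lower_left(x, d, w, g_minus, g_plus, rng=None):
    rng = rng or random.Random(0)
    p_left = g_minus if x[d] <= w else g_plus
    return rng.random() < p_left

print(branches_lower_left([0.3, 5.2], d=1, w=4.0, g_minus=0.9, g_plus=0.2))
```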
- the optimality determination processing unit 107 determines whether or not the optimization criterion A calculated using Equation 4 has converged. If it has not converged, the processing by the hierarchical hidden variable variation probability calculation processing unit 104, the component optimization processing unit 105, the gate function model optimization processing unit 106, and the optimality determination processing unit 107 is repeated. The optimality determination processing unit 107 may determine that the optimization criterion A has converged, for example, when the increment of the optimization criterion A is less than a predetermined threshold.
- hereinafter, the processing by the hierarchical hidden variable variation probability calculation processing unit 104, the component optimization processing unit 105, the gate function model optimization processing unit 106, and the optimality determination processing unit 107 may be collectively described as the first process. By repeating the first process and updating the variation distributions and the model, an appropriate model can be selected. Repeating these processes guarantees that the optimization criterion A increases monotonically.
- next, the optimal model selection processing unit 108 selects the optimal model. Specifically, when the optimization criterion A calculated by the first process for the number of hidden states set by the hierarchical hidden structure setting unit 102 is larger than the optimization criterion A of the model currently set as the optimal model, the optimal model selection processing unit 108 selects that model as the optimal model.
- when model optimization has been completed for all of the hierarchical hidden variable model structure candidates set from the input types of observation probability and numbers of components, the model estimation result output device 109 outputs the optimal number of hidden states, the types of observation probability, the parameters, the variation distributions, and the like as the model estimation result 112.
- if candidates remain, the processing moves to the hierarchical hidden structure setting unit 102, and the above-described processing is similarly performed.
- the hierarchical hidden structure setting unit 102, the initialization processing unit 103, the hierarchical hidden variable variation probability calculation processing unit 104 (more specifically, the lowest-layer path hidden variable variation probability calculation processing unit 104-1, the hierarchy setting unit 104-2, the upper-layer path hidden variable variation probability calculation processing unit 104-3, and the hierarchy calculation end determination processing unit 104-4), the component optimization processing unit 105, the gate function model optimization processing unit 106 (more specifically, the branch node information acquisition unit 106-1, the branch node selection processing unit 106-2, the branch parameter optimization processing unit 106-3, and the all-branch-node optimization end determination processing unit 106-4), the optimality determination processing unit 107, and the optimal model selection processing unit 108 are realized by a central processing unit (Central_Processing_Unit, CPU) of a computer that operates according to a program (a hierarchical hidden variable model estimation program).
- for example, the program may be stored in a storage unit (not shown) in the hierarchical hidden variable model estimation apparatus 100, and the CPU may read the program and operate as each of the units described above according to the program.
- alternatively, each of the units described above may be realized by dedicated hardware.
- FIG. 6 is a flowchart illustrating an operation example of the hierarchical hidden variable model estimation apparatus according to at least one embodiment of the present invention.
- the data input device 101 inputs the input data 111 (step S100).
- the hierarchical hidden structure setting unit 102 selects a hierarchical hidden structure that has not been optimized from the input candidate values of the hierarchical hidden structure, and sets the selected structure as a target to be optimized. (Step S101).
- the initialization processing unit 103 performs initialization processing of the parameters used for estimation and the variation probability of the hidden variable for the set hierarchical hidden structure (step S102).
- the hierarchical hidden variable variation probability calculation processing unit 104 calculates the variation probability of each path hidden variable (step S103).
- the component optimization processing unit 105 optimizes the component by estimating the type and parameter of the observation probability for each component (step S104).
- the gate function model optimization processing unit 106 optimizes the branch parameters in each branch node (step S105).
- the optimality determination processing unit 107 determines whether or not the optimization criterion A has converged (step S106). That is, the optimality determination processing unit 107 determines the optimality of the model.
- when it is not determined in step S106 that the optimization criterion A has converged (that is, when it is determined that the model is not optimal) (No in step S106a), the processing from step S103 to step S106 is repeated.
- on the other hand, when it is determined in step S106 that the optimization criterion A has converged (that is, when it is determined that the model is optimal) (Yes in step S106a), the optimal model selection processing unit 108 compares the optimization criterion A based on the currently estimated model (for example, the number of components, the types of observation probability, and the parameters) with the optimization criterion A based on the model currently set as the optimal model, and selects the model with the larger value as the optimal model (step S107).
- the optimum model selection processing unit 108 determines whether or not a candidate for the hidden hierarchical structure that has not been estimated remains (step S108). When candidates remain (Yes in step S108), the processing from step S101 to step S108 is repeated. On the other hand, if no candidate remains (No in step S108), the model estimation result output device 109 outputs the model estimation result, and the process is completed (step S109).
- the model estimation result output device 109 stores the component optimized by the component optimization processing unit 105 and the gate function model optimized by the gate function model optimization processing unit 106 in the model database 500.
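- a structural sketch of the loop in FIG. 6 (steps S101 to S109); every helper below is a trivial stand-in rather than the patent's actual computation:

```python
# Structural sketch of FIG. 6 (steps S101-S109); the helpers are placeholder stand-ins.
def initialize(structure):                               # S102
    return {"structure": structure, "criterion_A": float("-inf")}

def run_one_iteration(state):                            # stand-in for S103-S105
    new_state = dict(state)
    new_state["criterion_A"] = 1.0                       # pretend the criterion improved then settled
    return new_state

def has_converged(prev, curr, tol=1e-6):                 # S106 / S106a
    return abs(curr["criterion_A"] - prev["criterion_A"]) < tol

def estimate(structure_candidates):
    best_structure, best_A = None, float("-inf")
    for structure in structure_candidates:               # S101, S108: loop over candidates
        state = initialize(structure)
        while True:
            new_state = run_one_iteration(state)         # S103-S105: probs, components, gates
            converged = has_converged(state, new_state)  # S106: optimality determination
            state = new_state
            if converged:
                break
        if state["criterion_A"] > best_A:                # S107: keep the best model
            best_structure, best_A = structure, state["criterion_A"]
    return best_structure, best_A                        # S109: output the model estimation result

print(estimate(["2x2", "2x3"]))
```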
- FIG. 7 is a flowchart showing an example of the operation of the hierarchical hidden variable variation probability calculation processing unit 104 according to at least one embodiment of the present invention.
- the variation probability calculation unit 104-1 of the route hidden variable in the lowest layer calculates the variation probability of the route hidden variable in the lowest layer (step S111).
- the hierarchy setting unit 104-2 sets to which level the path hidden variable has been calculated (step S112).
- next, the upper-layer path hidden variable variation probability calculation processing unit 104-3 calculates the variation probability of the path hidden variable in the next higher layer, using the variation probability of the path hidden variable in the layer set by the hierarchy setting unit 104-2 (step S113).
- the hierarchy calculation end determination processing unit 104-4 determines whether or not there is a layer for which a route hidden variable has not been calculated (step S114). When a layer for which the route hidden variable is not calculated remains (No in step S114), the processing from step S112 to step S113 is repeated. On the other hand, when there is no layer in which the path hidden variable is not calculated (Yes in step S114), the hierarchical hidden variable variation probability calculation processing unit 104 completes the process.
- FIG. 8 is a flowchart showing an operation example of the gate function model optimization processing unit 106 according to at least one embodiment of the present invention.
- the branch node information acquisition unit 106-1 grasps all branch nodes (step S121).
- the branch node selection processing unit 106-2 selects one branch node to be optimized (step S122).
- the branch parameter optimization processing unit 106-3 optimizes the branch parameter in the selected branch node (step S123).
- next, the all-branch-node optimization end determination processing unit 106-4 determines whether or not any branch node that has not been optimized remains (step S124).
- when such a branch node remains (No in step S124), the processing from step S122 to step S123 is repeated.
- when no such branch node remains (Yes in step S124), the gate function model optimization processing unit 106 completes the process.
- the hierarchical hidden structure setting unit 102 sets the hierarchical hidden structure.
- the hierarchical hidden structure is a structure in which hidden variables are represented by a hierarchical structure (tree structure) and components representing a probability model are arranged at nodes in the lowest layer of the hierarchical structure.
- the hierarchical structure represents a structure in which one or more nodes are arranged in each hierarchy, and a path is provided between the nodes arranged in the first hierarchy and the nodes arranged in the lower second hierarchy.
- the hierarchical hidden variable variation probability calculation processing unit 104 calculates the variation probability of the path hidden variable (that is, the optimization criterion A).
- the hierarchical hidden variable variation probability calculation processing unit 104 may calculate the hidden variable variation probability for each layer of the hierarchical structure in order from the node in the lowest layer. Further, the hierarchical hidden variable variation probability calculation processing unit 104 may calculate the variation probability so as to maximize the marginal log-likelihood.
- the component optimization processing unit 105 optimizes the component with respect to the calculated variation probability.
- the gate function model optimization processing unit 106 optimizes the gate function model based on the variation probability of the hidden variable in the node of the hierarchical hidden structure. For example, when the structure of the hidden variable is a tree structure, the gate function model is a model that determines the branch direction according to the multivariate data at the node of the hierarchical hidden structure.
- since the hierarchical hidden variable model for multivariate data is estimated by the above-described configuration, according to the present embodiment, a hierarchical hidden variable model including hierarchical hidden variables can be estimated with an appropriate amount of computation and without losing theoretical validity. Further, by using the hierarchical hidden variable model estimation apparatus 100, it is not necessary to manually set criteria suitable for dividing data into components.
- the hierarchical hidden structure setting unit 102 may set a hierarchical hidden structure in which the hidden variables are represented by a binary tree structure, and the gate function model optimization processing unit 106 may optimize a gate function model based on the Bernoulli distribution, based on the variation probabilities of the hidden variables at the nodes. In this case, each parameter has an analytical solution, so optimization at a higher speed becomes possible.
- for example, based on the values of the explanatory variables in the input data, the hierarchical hidden variable model estimation apparatus 100 separates the data into components such as a price model according to the temperature level, a model according to the time zone, and a model according to the day of sale.
- FIG. 9 is a block diagram illustrating a configuration example of a price estimation apparatus 700 according to at least one embodiment of the present invention.
- the price estimation device 700 includes a data input device 701, a model acquisition unit 702, a component determination unit 703, a price prediction unit 704, and a prediction result output device 705.
- the data input device 701 inputs one or more explanatory variables, which are information that can affect the price, as input data 711.
- the types of explanatory variables constituting the input data 711 are the same as the types of explanatory variables in the input data 111.
- the data input device 701 is an example of a predicted data input unit.
- the model acquisition unit 702 acquires a gate function model and components from the model database 500 as models used for price prediction.
- the gate function model is a gate function model optimized by the gate function model optimization processing unit 106.
- the component is a component optimized by the component optimization processing unit 105.
- the component determination unit 703 determines the component associated with a node in the lowest layer by tracing the hierarchical hidden structure based on the input data 711 input by the data input device 701 and the gate function model acquired by the model acquisition unit 702. The component determination unit 703 then determines that component as the component for predicting the price.
- the price prediction unit 704 predicts the price related to the input data 711 by inputting the input data 711 input by the data input device 701 to the component determined by the component determination unit 703.
- the prediction result output device 705 outputs the prediction result 712 predicted by the price prediction unit 704.
- FIG. 10 is a flowchart showing an operation example of the price estimation apparatus 700 according to at least one embodiment of the present invention.
- the data input device 701 inputs the input data 711 (step S131).
- the data input device 701 may input a plurality of sets of input data 711 instead of a single input data 711 (in each embodiment of the present invention, the input data represents a data set (information group)).
- the data input device 701 may input the input data 711 for each device.
- the price prediction unit 704 predicts a price for each input data 711.
- the model acquisition unit 702 acquires a gate function model and components from the model database 500 (step S132).
- the price estimation apparatus 700 selects the input data 711 one by one, and executes the processes shown from step S134 to step S136 shown below for the selected input data 711 (step S133).
- the component determination unit 703 determines components to be used for price prediction by tracing from the root node of the hierarchical hidden structure to the node in the lowest layer based on the gate function model acquired by the model acquisition unit 702 (step S134). Specifically, the component determination unit 703 determines a component in the following procedure.
- the component determination unit 703 reads the gate function model associated with the node for each node of the hierarchical hidden structure. Next, the component determination unit 703 determines whether or not the input data 711 satisfies the read gate function model. Next, the component determination unit 703 determines a child node to be traced next based on the determination result. When the component determination unit 703 traces the hierarchically hidden node by the processing and reaches the node in the lowest layer, the component determination unit 703 determines a component associated with the node as a component used for price prediction.
- the price prediction unit 704 predicts the price by substituting the input data 711 selected in step S133 for the component (step S135). Then, the prediction result output device 705 outputs the price prediction result 712 by the price prediction unit 704 (step S136).
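- a sketch of steps S134 and S135 under the assumption that each gate is evaluated as a deterministic threshold test and each component is a weight vector; the tree encoding and names are illustrative only:

```python
# Sketch of steps S134-S135: trace gate decisions from the root to a leaf component,
# then predict the price as that component's weighted sum of explanatory variables.
def determine_component(node, x):
    # internal nodes: ("gate", dim, threshold, left_child, right_child); leaves: ("component", weights)
    while node[0] == "gate":
        _, dim, threshold, left, right = node
        node = left if x[dim] <= threshold else right
    return node[1]

def predict_price(node, x):
    weights = determine_component(node, x)
    return sum(w * x.get(name, 1.0) for name, w in weights.items())  # "intercept" maps to 1.0

tree = ("gate", "mileage_10k_km", 5.0,
        ("component", {"intercept": 110.0, "service_life_years": -5.0, "mileage_10k_km": -2.0}),
        ("component", {"intercept": 95.0,  "service_life_years": -7.0, "mileage_10k_km": -3.0}))

x = {"service_life_years": 3.0, "mileage_10k_km": 4.2}
print(predict_price(tree, x))  # 110 - 5*3.0 - 2*4.2 = 86.6
```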
- the price estimation apparatus 700 completes the processing by executing the processing from step S134 to step S136 for all the input data 711.
- as described above, the price estimation apparatus 700 can accurately predict the price by using an appropriate component based on the gate function model. In other words, the price estimation device 700 can predict the price based on components classified according to an appropriate criterion.
- the price prediction system according to the present embodiment differs from the price prediction system 10 in that, for example, the hierarchical hidden variable model estimation device 100 is replaced with a hierarchical hidden variable model estimation device 200.
- FIG. 11 is a block diagram showing a configuration example of a hierarchical hidden variable model estimation apparatus according to at least one embodiment of the present invention.
- components identical to those in FIG. 3 are given the same reference symbols, and their description is omitted.
- the hierarchical hidden variable model estimation apparatus 200 of the present embodiment differs in that, for example, a hierarchical hidden structure optimization processing unit 201 is connected and the optimal model selection processing unit 108 is not connected.
- the hierarchical hidden variable model estimation apparatus 100 optimizes the components and the gate function model for each hierarchical hidden structure candidate and selects the hierarchical hidden structure that maximizes the optimization criterion A.
- in contrast, in the hierarchical hidden variable model estimation apparatus 200, a process is added in which, after the processing by the hierarchical hidden variable variation probability calculation processing unit 104, the hierarchical hidden structure optimization processing unit 201 removes from the model the paths whose hidden variables have become small.
- FIG. 12 is a block diagram showing a configuration example of the optimization processing unit 201 having a hierarchical hidden structure according to at least one embodiment of the present invention.
- the hierarchical hidden structure optimization processing unit 201 includes a route hidden variable sum operation processing unit 201-1, a route removal determination processing unit 201-2, and a route removal execution processing unit 201-3.
- the path hidden variable sum calculation processing unit 201-1 receives the hierarchical hidden variable variation probability 104-6 and calculates, for each component, the sum over samples of the variation probabilities of the lowest-layer path hidden variable (hereinafter referred to as the sample sum).
- the path removal determination processing unit 201-2 determines whether the sample sum is equal to or smaller than a predetermined threshold value ⁇ .
- ⁇ is a threshold value input together with the input data 111.
- the condition determined by the path removal determination processing unit 201-2 can be expressed, for example, as Equation 5: Σ_n q(z_ij^n) ≤ ε.
- the route removal determination processing unit 201-2 determines whether or not the variation probability q (z ij n ) of the route hidden variable in the lowest layer in each component satisfies the criterion represented by Expression 5. In other words, it can be said that the path removal determination processing unit 201-2 determines whether the sample sum is sufficiently small.
- the path removal execution processing unit 201-3 sets to 0 the variation probability of the paths determined to have a sufficiently small sample sum. Then, using the lowest-layer path hidden variable variation probabilities normalized over the remaining paths (that is, the paths whose probability was not set to 0), the path removal execution processing unit 201-3 recalculates and outputs the hierarchical hidden variable variation probability 104-6 for each layer.
- Equation 6 represents an example of the update expression of q(z_ij^n) in the iterative optimization.
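- a toy sketch of this pruning and renormalization, assuming the sample sum of Equation 5 is computed per lowest-layer path (i, j):

```python
# Toy sketch of the hierarchical hidden structure optimization: remove lowest-layer paths
# whose summed variational probability over samples is at most epsilon, then renormalize.
def prune_paths(q, epsilon):
    # q[n][(i, j)] = variational probability of path (i, j) for sample n
    paths = q[0].keys()
    sample_sum = {p: sum(qn[p] for qn in q) for p in paths}
    kept = {p for p in paths if sample_sum[p] > epsilon}
    pruned = []
    for qn in q:
        total = sum(qn[p] for p in kept)
        pruned.append({p: (qn[p] / total if p in kept else 0.0) for p in paths})
    return pruned

q = [{(1, 1): 0.70, (1, 2): 0.29, (2, 1): 0.01},
     {(1, 1): 0.60, (1, 2): 0.38, (2, 1): 0.02}]
print(prune_paths(q, epsilon=0.1))  # path (2, 1) is removed and the rest renormalized
```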
- the hierarchical hidden structure optimization processing unit 201 (more specifically, the path hidden variable sum calculation processing unit 201-1, the path removal determination processing unit 201-2, and the path removal execution processing unit 201-3) is realized by a CPU of a computer that operates according to a program (a hierarchical hidden variable model estimation program).
- FIG. 13 is a flowchart showing an operation example of the hierarchical hidden variable model estimation apparatus 200 according to at least one embodiment of the present invention.
- the data input device 101 inputs the input data 111 (step S200).
- the hierarchical hidden structure setting unit 102 sets the initial number of hidden states as the hierarchical hidden structure (step S201).
- in the first embodiment, the optimal solution is searched for by trying all of the plurality of candidates for the number of components.
- in contrast, in the present embodiment, the hierarchical hidden structure can be optimized in a single run. Therefore, in step S201, it is only necessary to set an initial value of the number of hidden states once, instead of selecting a candidate that has not yet been optimized from a plurality of candidates as in the first embodiment.
- the initialization processing unit 103 performs initialization processing such as parameters used for estimation and variation probability of hidden variables on the set hierarchical hidden structure (step S202).
- the hierarchical hidden variable variation probability calculation processing unit 104 calculates the variation probability of each path hidden variable (step S203).
- the hierarchical hidden structure optimization processing unit 201 optimizes the hierarchical hidden structure by estimating the number of components (step S204). That is, since the components are arranged in the nodes in the lowest layers, the number of components is optimized when the hierarchical hidden structure is optimized.
- the component optimization processing unit 105 optimizes the component by estimating the type and parameter of the observation probability for each component (step S205).
- the gate function model optimization processing unit 106 optimizes the branch parameters in each branch node (step S206).
- the optimality determination processing unit 107 determines whether or not the optimization criterion A has converged (step S207). That is, the optimality determination processing unit 107 determines the optimality of the model.
- When it is not determined in step S207 that the optimization criterion A has converged (that is, when the model is determined not to be optimal) (No in step S207a), the processing from step S203 to step S207 is repeated.
- When it is determined in step S207 that the optimization criterion A has converged (that is, when the model is determined to be optimal) (Yes in step S207a), the model estimation result output device 109 outputs the model estimation result 112, and the process is completed (step S208).
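The overall flow of steps S200 to S208 can be summarized by the skeleton below. This is a sketch only: the individual step functions are passed in as callables because their internals depend on the model, and the names and signatures shown here are illustrative placeholders rather than the actual interfaces of the apparatus.

```python
def estimate_hierarchical_hidden_variable_model(
        input_data, structure, initialize, compute_variational_probabilities,
        optimize_structure, optimize_components, optimize_gate_functions,
        optimization_criterion_a, max_iter=100, tol=1e-6):
    params = initialize(structure, input_data)                                   # S202
    previous = float("-inf")
    for _ in range(max_iter):
        q = compute_variational_probabilities(structure, params, input_data)     # S203
        structure, q = optimize_structure(structure, q)                           # S204
        params = optimize_components(structure, q, input_data, params)            # S205
        params = optimize_gate_functions(structure, q, input_data, params)        # S206
        criterion = optimization_criterion_a(structure, params, input_data)       # S207
        if abs(criterion - previous) < tol:                                       # convergence check (S207a)
            break
        previous = criterion
    return structure, params, criterion                                           # output of the estimation result (S208)
```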
- FIG. 14 is a flowchart showing an operation example of the hierarchical hidden structure optimization processing unit 201 according to at least one embodiment of the present invention.
- the route hidden variable sum operation processing unit 201-1 calculates a sample sum of route hidden variables (step S211).
- the path removal determination processing unit 201-2 determines whether or not the calculated sample sum is sufficiently small (step S212).
- The path removal execution processing unit 201-3 sets to 0 the variational probability of each lowest-layer path hidden variable whose sample sum is determined to be sufficiently small, recalculates and outputs the hierarchical hidden variable variational probabilities accordingly, and completes the process (step S213).
- the hierarchical hidden structure optimization processing unit 201 optimizes the hierarchical hidden structure by excluding routes whose calculated variation probability is equal to or less than a predetermined threshold from the model.
- the price prediction system according to the present embodiment is different from the second embodiment in the configuration of a hierarchical hidden variable model estimation device, for example.
- The hierarchical hidden variable model estimation apparatus according to the present embodiment differs from the hierarchical hidden variable model estimation apparatus 200 in that, for example, the gate function model optimization processing unit 106 is replaced with a gate function model optimization processing unit 113.
- FIG. 15 is a block diagram showing a configuration example of the gate function model optimization processing unit 113 according to at least one embodiment of the present invention.
- the gate function model optimization processing unit 113 includes an effective branch node selection unit 113-1 and a branch parameter optimization parallel processing unit 113-2.
- The effective branch node selection unit 113-1 selects effective branch nodes from the hierarchical hidden structure. Specifically, the effective branch node selection unit 113-1 selects the effective branch nodes using the estimation model 104-5 estimated by the component optimization processing unit 105, taking into account the paths removed from the model. That is, an effective branch node is a branch node on a path that has not been removed from the hierarchical hidden structure.
- the branch parameter optimization parallel processing unit 113-2 performs the branch parameter optimization processing on the valid branch nodes in parallel, and outputs the processing result as the gate function model 106-6.
- Specifically, the branch parameter optimization parallel processing unit 113-2 optimizes the branch parameters of all the effective branch nodes in parallel, based on the input data 111 and the hierarchical hidden variable variational probability 104-6 calculated by the hierarchical hidden variable variational probability calculation processing unit 104.
- the branch parameter optimization parallel processing unit 113-2 may be configured by, for example, arranging the branch parameter optimization processing units 106-3 of the first embodiment in parallel as illustrated in FIG. With such a configuration, branch parameters of all gate function models can be optimized at one time.
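A minimal sketch of this parallel arrangement is shown below. It assumes a hypothetical per-node optimization routine `optimize_branch_parameters` (playing the role of the branch parameter optimization processing unit 106-3) and branch node objects carrying a `removed` flag; both are illustrative assumptions rather than the actual interfaces of the apparatus.

```python
from concurrent.futures import ProcessPoolExecutor

def optimize_effective_nodes(branch_nodes, input_data, variational_probabilities,
                             optimize_branch_parameters, max_workers=4):
    # Selection of effective branch nodes (role of unit 113-1): keep only nodes
    # that do not lie on a path removed from the hierarchical hidden structure.
    effective = [node for node in branch_nodes if not node.removed]
    # Parallel optimization of branch parameters (role of unit 113-2).
    with ProcessPoolExecutor(max_workers=max_workers) as executor:
        futures = [executor.submit(optimize_branch_parameters, node,
                                   input_data, variational_probabilities)
                   for node in effective]
        return [future.result() for future in futures]
```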
- The hierarchical hidden variable model estimation apparatuses 100 and 200 execute the optimization processes of the gate function models one by one, whereas the hierarchical hidden variable model estimation apparatus of the present embodiment can execute the gate function model optimization processes in parallel, which enables faster model estimation.
- The gate function model optimization processing unit 113 (more specifically, the effective branch node selection unit 113-1 and the branch parameter optimization parallel processing unit 113-2) is realized by a CPU of a computer that operates according to a program (a hierarchical hidden variable model estimation program).
- FIG. 16 is a flowchart showing an operation example of the gate function model optimization processing unit 113 according to at least one embodiment of the present invention.
- the valid branch node selection unit 113-1 selects all valid branch nodes (step S301).
- the parallel processing unit 113-2 for branch parameter optimization optimizes all the valid branch nodes in parallel and completes the processing (step S302).
- the effective branch node selection unit 113-1 selects an effective branch node from the nodes having the hierarchical hidden structure.
- the parallel processing unit 113-2 for branch parameter optimization optimizes the gate function model based on the variation probability of the hidden variable at the valid branch node.
- the branch parameter optimization parallel processing unit 113-2 processes the optimization of each branch parameter related to an effective branch node in parallel. Therefore, since the optimization process of the gate function model can be performed in parallel, the model can be estimated at a higher speed in addition to the effect of the above-described embodiment.
- FIG. 17 is a block diagram showing a basic configuration of a hierarchical hidden variable model estimation apparatus according to at least one embodiment of the present invention.
- the hierarchical hidden variable model estimation device estimates a hierarchical hidden variable model that predicts a price related to an object.
- The hierarchical hidden variable model estimation apparatus includes, as a basic configuration, a learning information input unit 80, a variation probability calculation unit 81, a hierarchical hidden structure setting unit 82, a component optimization processing unit 83, and a gate function model optimization unit 84.
- the learning information input unit 80 inputs learning data that is a plurality of combinations of an objective variable that is a known price and one or more explanatory variables that are information that can affect the price.
- An example of the learning information input unit 80 is the data input device 101.
- the hierarchical hidden structure setting unit 82 sets, for example, a hierarchical hidden structure in which a hidden variable is represented by a tree structure and a component representing a probability model is arranged at a node in the lowest layer of the tree structure.
- An example of the hierarchical hidden structure setting unit 82 is the hierarchical hidden structure setting unit 102.
- The variation probability calculation unit 81 calculates the variational probability (for example, optimization criterion A) of the path hidden variables, which are the hidden variables included in the path connecting the root node to a target node in the hierarchical hidden structure.
- An example of the variation probability calculation unit 81 is a calculation processing unit 104 for a variation probability of a hierarchical hidden variable.
- the component optimization processing unit 83 optimizes the component with respect to the calculated variation probability based on the learning data input by the learning information input unit 80.
- An example of the component optimization processing unit 83 is the component optimization processing unit 105.
- The gate function model optimization unit 84 optimizes, at each node of the hierarchical hidden structure, the gate function model, which is a model for determining the branch direction according to the explanatory variables, based on the variational probability of the hidden variable at that node.
- An example of the gate function model optimization unit 84 is a gate function model optimization processing unit 106.
- the hierarchical hidden variable model estimation apparatus can estimate a hierarchical hidden variable model including a hierarchical hidden variable with an appropriate amount of calculation without losing theoretical validity.
- The hierarchical hidden variable model estimation apparatus may include a hierarchical hidden structure optimization unit (for example, the hierarchical hidden structure optimization processing unit 201) that optimizes the hierarchical hidden structure by excluding from the model the paths whose calculated variational probability is equal to or less than a predetermined threshold. In other words, the hierarchical hidden structure optimization unit optimizes the hierarchical hidden structure by excluding the paths whose calculated variational probability does not satisfy the criterion. With such a configuration, it is not necessary to optimize a plurality of hierarchical hidden structure candidates, and the number of components can be optimized in a single execution.
- The gate function model optimization unit 84 may include an effective branch node selection unit (for example, the effective branch node selection unit 113-1) that selects, from the nodes of the hierarchical hidden structure, the effective branch nodes, that is, the branch nodes on paths that have not been excluded from the hierarchical hidden structure.
- The gate function model optimization unit 84 may also include a branch parameter optimization parallel processing unit (for example, the branch parameter optimization parallel processing unit 113-2) that optimizes the gate function models based on the variational probabilities of the hidden variables at the effective branch nodes.
- the parallel processing unit for branch parameter optimization may process optimization of each branch parameter related to an effective branch node in parallel. Such a configuration enables faster model estimation.
- the hierarchical hidden structure setting unit 82 may set a hierarchical hidden structure in which the hidden variable is represented by a binary tree structure. Then, the gate function model optimization unit 84 may optimize the gate function model based on the Bernoulli distribution based on the variation probability of the hidden variable at the node. In this case, since each parameter has an analytical solution, optimization at a higher speed becomes possible.
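For intuition, a weighted maximum-likelihood update for a Bernoulli gate parameter has the closed form sketched below: the left-branch probability is the ratio of the variational probability mass routed left to the total mass routed through the node. This is an illustrative instance of why such parameters admit analytical solutions under the stated assumptions, and not necessarily the exact update used by the gate function model optimization unit 84.

```python
import numpy as np

def bernoulli_gate_update(q_left: np.ndarray, q_right: np.ndarray) -> float:
    """Closed-form maximizer of sum_i (q_left_i * log p + q_right_i * log(1 - p)) over p.

    q_left / q_right: per-sample variational probabilities of the two child paths.
    """
    total = q_left.sum() + q_right.sum()
    return q_left.sum() / total if total > 0 else 0.5

q_left = np.array([0.9, 0.2, 0.7])
q_right = np.array([0.1, 0.8, 0.3])
print(bernoulli_gate_update(q_left, q_right))  # 1.8 / 3.0 = 0.6
```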
- The variation probability calculation unit 81 may calculate the variational probability of the hidden variables so as to maximize the marginal log likelihood.
- FIG. 18 is a block diagram showing a basic configuration of a price estimation device 93 according to at least one embodiment of the present invention.
- the price estimation device 93 includes a prediction data input unit 90, a component determination unit 91, and a price prediction unit 92.
- the prediction data input unit 90 inputs prediction data that is one or more explanatory variables that are information that can affect the price.
- An example of the prediction data input unit 90 is a data input device 701.
- The component determination unit 91 determines the component to be used for price prediction based on a hierarchical hidden structure, in which hidden variables are represented in a hierarchical structure and components representing probability models are arranged at the nodes in the lowest layer of the hierarchical structure, on gate function models that determine the branch direction at the nodes of the hierarchical hidden structure, and on the prediction data.
- An example of the component determining unit 91 is a component determining unit 703.
- the price prediction unit 92 predicts the price based on the component determined by the component determination unit 91 and the prediction data.
- An example of the price prediction unit 92 is a price prediction unit 704.
- the price estimation apparatus can accurately predict the price by using an appropriate component based on the gate function model.
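The following sketch illustrates how such a device could combine a gate function tree and per-component models: the prediction data is routed down the tree to determine a component, and that component then produces the price. The node and component classes, the thresholding gate, and the linear form of each component are assumptions made purely for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Component:
    weights: dict          # coefficient per explanatory variable (illustrative linear model)
    intercept: float

    def predict(self, x: dict) -> float:
        return self.intercept + sum(w * x[name] for name, w in self.weights.items())

@dataclass
class Node:
    variable: Optional[str] = None         # explanatory variable tested by the gate
    threshold: float = 0.0
    left: Optional["Node"] = None          # branch taken when x[variable] < threshold
    right: Optional["Node"] = None
    component: Optional[Component] = None  # set only at lowest-layer nodes

def determine_component(node: Node, x: dict) -> Component:
    while node.component is None:          # descend the hierarchical hidden structure
        node = node.left if x[node.variable] < node.threshold else node.right
    return node.component

# Usage: one gate node splitting on "use_period", two lowest-layer components.
tree = Node(variable="use_period", threshold=3,
            left=Node(component=Component({"use_period": -5.0}, 120.0)),
            right=Node(component=Component({"use_period": -8.0}, 130.0)))
x = {"use_period": 4}
component = determine_component(tree, x)
print(component.predict(x))   # 130.0 - 8.0 * 4 = 98.0
```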
- FIG. 19 is a schematic block diagram showing a configuration of a computer according to at least one embodiment of the present invention.
- the computer 1000 includes a CPU 1001, a main storage device 1002, an auxiliary storage device 1003, and an interface 1004.
- the hierarchical hidden variable model estimation device and price estimation device are each implemented in the computer 1000.
- the computer 1000 on which the hierarchical hidden variable model estimation device is mounted may be different from the computer 1000 on which the price estimation device is mounted.
- the operation of each processing unit according to at least one embodiment is stored in the auxiliary storage device 1003 in the form of a program (a hierarchical hidden variable model estimation program or a price prediction program).
- the CPU 1001 reads out the program from the auxiliary storage device 1003, expands it in the main storage device 1002, and executes the above processing according to the program.
- the auxiliary storage device 1003 is an example of a tangible medium that is not temporary.
- Other examples of the non-temporary tangible medium include a magnetic disk, a magneto-optical disk, a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc)-ROM, and a semiconductor memory connected via the interface 1004.
- The computer 1000 that has received the distributed program may load the program into the main storage device 1002 and execute the above processing.
- the program may realize a part of the functions described above.
- Further, the program may be a so-called difference file (difference program), that is, a file that realizes the above-described functions in combination with another program already stored in the auxiliary storage device 1003.
- FIG. 20 is a block diagram showing a configuration of the estimation apparatus 310 according to the fourth embodiment of the present invention.
- the estimation apparatus 310 includes an estimation unit 311.
- The estimation unit 311 receives second information 410 including one or more explanatory variables when estimating the price of an object at a second time.
- At least one of the explanatory variables is an attribute representing the length of the period between a first time, at which a specific event occurs regarding the object, and the second time.
- A specific event is an event that occurs with respect to the object. For example, when the object is a device, the specific event is an event such as purchasing the device, servicing the device, or starting to sell devices assigned the same model number as the device. The specific event may also be an event such as starting to sell an upgraded device that is assigned a model number different from that of the device.
- the second time represents, for example, the time when the vehicle is scheduled to be sold.
- FIG. 21 is a diagram conceptually illustrating an example of a first information set according to at least one embodiment of the present invention.
- the first information set includes a plurality of first information.
- the first information associates the value of one or more explanatory variables with the value (price) of the objective variable. That is, the first information associates an object (represented as “second object” for convenience of description) with a price related to the second object.
- At least one of the explanatory variables is an attribute that represents the length of the period between the first time related to the object represented by the first information and a third time associated with the objective variable.
- the attribute is a use period, a period until maintenance, or a period until an accessory is replaced.
- the attribute is not limited to the example shown in FIG.
- The second line in FIG. 21 represents an example of the first information. That is, this first information includes 3 (a value indicating the use period), 10 (a value indicating the period until maintenance), 1 (a value indicating the period until replacement of an accessory), and 100 (a price value).
- the above-mentioned third time represents the time when the vehicle was sold, for example.
- the use period represents a period between the time when the vehicle is purchased and the time when the vehicle is sold.
- the specific event represents an event of purchasing the vehicle.
- the period until maintenance represents a period between the time when the vehicle is sold and the time when the next maintenance is required for the vehicle.
- the specific event represents an event of maintaining the vehicle.
- the period until maintenance represents the remaining period of legal vehicle inspection (vehicle inspection) related to the vehicle.
- the period until the accessory is replaced represents a period between the time when the vehicle is sold and the time when the accessory needs to be replaced next.
- the specific event represents the event of the next replacement for accessories attached to the vehicle.
- the accessory is a wheel of the vehicle.
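To make the attributes in FIG. 21 concrete, the following sketch derives them from event dates. The field names, the choice of months as the unit, and the specific dates are illustrative assumptions, chosen so that the result matches the example row (3, 10, 1, 100) described above.

```python
from datetime import date

def months_between(start: date, end: date) -> int:
    return (end.year - start.year) * 12 + (end.month - start.month)

def build_first_information(purchase_date, sale_date, next_inspection_date,
                            next_wheel_replacement_date, sale_price):
    return {
        "use_period": months_between(purchase_date, sale_date),
        "period_until_maintenance": months_between(sale_date, next_inspection_date),
        "period_until_accessory_replacement": months_between(sale_date, next_wheel_replacement_date),
        "price": sale_price,
    }

row = build_first_information(date(2014, 12, 1), date(2015, 3, 1),
                              date(2016, 1, 1), date(2015, 4, 1), 100)
print(row)  # use_period=3, period_until_maintenance=10, period_until_accessory_replacement=1, price=100
```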
- a rule may be extracted by a method such as a support vector machine, a neural network, or a decision tree.
- the hierarchical hidden variable model estimation device may extract rule information (such as components and gate function models).
- the estimation unit 311 applies the rule information 411 to the second information 410 and calculates the result as a price 412.
- the estimation device 310 can predict the price of the object in the second period with high accuracy.
- the sale price is often related to the remaining period of the vehicle inspection at the time of sale.
- The estimation device 310 predicts the target price based on the rule information 411 representing such a relationship.
- the price predicted by the estimation device 310 is a more accurate price because it is based on the relationship between the remaining period and the sale price.
- the price for selling the equipment is affected by the period until the accessories that make up the equipment are replaced, the elapsed time since the equipment is purchased, and the like.
- the price predicted by the estimation apparatus 310 is a more accurate price because it is based on the period until the exchange and the relationship between the elapsed period and the sale price.
- According to the estimation apparatus 310, it is therefore possible to predict the price of the object at the second time with high accuracy.
- FIG. 22 is a block diagram showing the configuration of the price estimation apparatus 97 according to the fifth embodiment of the present invention.
- the price estimation apparatus 97 includes a prediction data input unit 94, a component determination unit 91, and a price prediction unit 92.
- the price estimation device 97 may further include a learning data input unit 95 and a variation probability calculation unit 96.
- the prediction data input unit 94 inputs second information that is one or more explanatory variables that are information that can affect the price. However, at least one of the explanatory variables is the attribute shown in the fourth embodiment of the present invention.
- An example of the prediction data input unit 94 is a data input device 701.
- The component determination unit 91 determines a component based on the second information input to the prediction data input unit 94.
- the price prediction unit 92 predicts the price based on the component determined by the component determination unit 91.
- At least one of the explanatory variables is an attribute shown in the fourth embodiment of the present invention. Therefore, according to the price estimation device 97 according to the fifth embodiment, it is possible to predict the price with high accuracy based on the same reason as the reason described in the fourth embodiment.
- the price estimation device 97 further includes a learning data input unit 95 and a variation probability calculation unit 96 in addition to the above-described configuration.
- the learning data input unit 95 inputs first information that is a plurality of combinations of an objective variable that is a price and one or more explanatory variables that are information that can affect the price.
- An example of the learning data input unit 95 is a data input device 101.
- The variation probability calculation unit 96 calculates the variational probability (for example, optimization criterion A) of the hidden variables included in the path connecting the root node to a target node in the hierarchical hidden structure. At this time, the variation probability calculation unit 96 arranges the above-described attributes on the path. An example of the variation probability calculation unit 96 is the hierarchical hidden variable variational probability calculation processing unit 104.
- FIG. 23 is a diagram illustrating an example of a gate function model and components calculated by the price estimation device 97 when the hidden variable model according to at least one embodiment of the present invention has a tree structure.
- a condition regarding a specific explanatory variable (in this case, a random variable) is assigned to each node (node 2302 and node 2303) in the tree structure. That is, the variation probability calculation unit 96 arranges the above-described attributes as explanatory variables.
- the node 2302 represents a condition regarding whether or not the attribute value is 3 or more (condition information 2308).
- the node 2303 represents a condition (condition information 2310) regarding whether or not the value of the explanatory variable B is 5.
- the variation probability calculation unit 96 places the above-described attributes in the node 2302.
- Depending on whether the condition at the node 2302 (condition information 2308) is satisfied, it is assumed, based on the probability information 2307, that in one case the probability of selecting the branch A1 is 0.05 and the probability of selecting the branch A2 is 0.95, and that in the other case the probability of selecting the branch A1 is 0.8 and the probability of selecting the branch A2 is 0.2.
- Similarly, depending on whether the condition at the node 2303 (condition information 2310) is satisfied, it is assumed, based on the probability information 2309, that in one case the probability of selecting the branch B1 is 0.25 and the probability of selecting the branch B2 is 0.75, and that in the other case the probability of selecting the branch B1 is 0.7 and the probability of selecting the branch B2 is 0.3.
- For example, the probability of each component is calculated using the gate function models, and the component with the highest probability is selected.
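A sketch of this selection is shown below. The tree topology assumed here (branch A1 leading directly to the component 2304, branch A2 leading to the node 2303, whose branches B1 and B2 lead to the components 2305 and 2306) and the choice of which probability set applies are assumptions made for illustration; only the probability values themselves come from the description above.

```python
def component_probabilities(p_a1: float, p_b1: float) -> dict:
    """Probability of reaching each lowest-layer component through the assumed two-level tree."""
    p_a2 = 1.0 - p_a1
    p_b2 = 1.0 - p_b1
    return {
        "component_2304": p_a1,            # reached via branch A1
        "component_2305": p_a2 * p_b1,     # reached via branches A2 then B1
        "component_2306": p_a2 * p_b2,     # reached via branches A2 then B2
    }

# Using one assumed case of probability information 2307 (A1 = 0.8) and 2309 (B1 = 0.7):
probs = component_probabilities(p_a1=0.8, p_b1=0.7)
print(probs)                      # approximately 0.8, 0.14, 0.06
print(max(probs, key=probs.get))  # the component with the highest probability (component_2304)
```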
- the price estimation device 97 can predict the price more accurately.
- the price estimation device 97 includes the configuration of the hierarchical hidden variable model estimation device according to the above-described embodiments of the present invention.
- FIG. 24 is a block diagram showing a configuration of the price estimation device 131 according to the sixth embodiment of the present invention.
- FIG. 25 is a flowchart showing the flow of processing in the price estimation apparatus 131 according to the sixth embodiment.
- the price estimation apparatus 131 includes a prediction data input unit 132, a component determination unit 133, and a price prediction unit 134. Furthermore, the price estimation device 131 includes a learning data input unit 135, a data selection unit 136, and a variation probability calculation unit 137.
- The learning data input unit 135 inputs a first information set consisting of pieces of first information, which are a plurality of combinations of an objective variable that is a price and one or more explanatory variables that are information that can affect the price.
- An example of the learning data input unit 135 is the data input device 101.
- Each first information is associated with the first time when the objective variable (price) related to the object associated with the first information is determined.
- the prediction data input unit 132 inputs second information that is one or more explanatory variables that are information that may affect the price.
- An example of the prediction data input unit 132 is a data input device 701.
- the second information is associated with the second time when the price is predicted for the object associated with the second information.
- the data selection unit 136 selects specific first information from the first information set based on the second time (step S1001).
- For example, the data selection unit 136 selects, from the first information set, the specific first information for which the period between the second time and the first time associated with that first information is equal to or less than a specific value.
- Alternatively, the data selection unit 136 may select the specific first information whose first time is before the second time and for which the period between the first time and the second time is equal to or less than a specific value.
- Alternatively, the data selection unit 136 may select a specific number of pieces of first information in ascending order of the period between the first time and the second time.
- the processing in the data selection unit 136 is not limited to the example described above.
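As one possible realization of this selection, the following sketch filters and sorts the first information set by the distance between the first time and the second time; the dictionary field names and the one-year window are illustrative assumptions.

```python
from datetime import date, timedelta

def select_first_information(first_information_set, second_time,
                             window=timedelta(days=365), only_before=False, max_items=None):
    """Keep first information whose first time lies within `window` of the second time."""
    selected = [info for info in first_information_set
                if abs(second_time - info["first_time"]) <= window
                and (not only_before or info["first_time"] <= second_time)]
    selected.sort(key=lambda info: abs(second_time - info["first_time"]))  # closest first
    return selected if max_items is None else selected[:max_items]

data = [{"first_time": date(2014, 6, 1), "price": 100},
        {"first_time": date(2010, 6, 1), "price": 80}]
print(select_first_information(data, second_time=date(2015, 3, 1)))  # only the 2014 record remains
```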
- the variation probability calculation unit 137 calculates the variation probability based on the specific first information selected by the data selection unit 136 (step S1002). As an example of the variation probability calculation unit 137, a variation probability calculation processing unit 104 of a hierarchical hidden variable is given.
- the component determination unit 133 determines a component based on the second information.
- Specifically, the component determination unit 133 determines the component based on a hierarchical hidden structure, in which hidden variables are represented in a hierarchical structure and components representing probability models are arranged at the nodes in the lowest layer, and on gate function models that determine the branch direction at the nodes of the hierarchical hidden structure.
- An example of the component determination unit 133 is a component determination unit 703.
- The price prediction unit 134 predicts the price at the second time related to the second information, based on the component determined by the component determination unit 133 (step S1003).
- An example of the price prediction unit 134 is a price prediction unit 704.
- the price estimation device 131 can predict the price with higher accuracy.
- The reasons are, for example, Reason 1 and Reason 2 below.
- (Reason 1) The configuration of the price estimation apparatus according to the sixth embodiment includes the configuration of the price estimation apparatus according to the above-described embodiments. (Reason 2) Since the second information and the first information are similar (or coincide), components, gate function models, and the like that are suitable for classifying the second information can be created.
- As described above, the data selection unit 136 selects the first information associated with a first time close to the second time. For example, when the object is a vehicle of a specific vehicle type, the closer the sale times, the more similar (or coincident) the vehicle prices tend to be. Therefore, the first information and the second information become similar (or coincident) as a result of the data selection unit 136 performing the above-described processing.
- FIG. 26 is a block diagram showing a configuration of the price estimation apparatus 121 according to the seventh embodiment of the present invention.
- FIG. 27 is a flowchart showing the flow of processing in the price estimation apparatus 121 according to the seventh embodiment.
- the price estimation apparatus 121 includes a prediction data input unit 122, a component determination unit 123, a price prediction unit 124, and a second price conversion unit 125.
- the price estimation apparatus 121 further includes a learning data input unit 126, a first price conversion unit 127, and a component optimization processing unit 128.
- the learning data input unit 126 inputs a first information set including first information that is a plurality of combinations of an objective variable that is a price and one or more explanatory variables that are information that can affect the price.
- An example of the learning data input unit 126 is the data input device 101.
- The first price conversion unit 127 calculates the second price by applying a specific conversion function to the objective variable (price) in the first information set input to the learning data input unit 126 (step S1101).
- the first price conversion unit 127 creates the third information by associating the calculated second price with the explanatory variable associated with the price on which the second price is calculated. That is, the first price conversion unit 127 calculates a third information set including the third information based on the first information set.
- The specific conversion function is a predetermined monotonic function whose slope is not constant, such as an exponential function or a logarithmic function.
- the component optimization processing unit 128 optimizes the component with respect to the calculated variation probability based on the third information set (step S1102).
- An example of the component optimization processing unit 128 is the component optimization processing unit 105.
- the prediction data input unit 122 inputs second information that is one or more explanatory variables that are information that may affect the price.
- An example of the prediction data input unit 122 is a data input device 701.
- The component determination unit 123 determines the component to be used for price prediction based on a hierarchical hidden structure, in which hidden variables are represented in a hierarchical structure and components representing probability models are arranged at the nodes in the lowest layer, on gate function models that determine the branch direction at the nodes of the hierarchical hidden structure, and on the second information.
- the component is a component optimized by the component optimization processing unit 128 based on the third information set.
- An example of the component determination unit 123 is a component determination unit 703.
- the price prediction unit 124 predicts the second price based on the component determined by the component determination unit 123 and the second information (step S1103).
- An example of the price prediction unit 124 is a price prediction unit 704.
- The second price conversion unit 125 calculates a price by applying the inverse function of the specific conversion function used by the first price conversion unit 127 to the second price predicted by the price prediction unit 124 (step S1104).
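The two conversions can be sketched as follows, assuming a logarithmic conversion function; the averaging step in the example merely stands in for whatever prediction the optimized component would produce for the second price.

```python
import math

def first_price_conversion(prices):
    """Step S1101: apply the conversion function (here, log) to each price."""
    return [math.log(p) for p in prices]

def second_price_conversion(predicted_second_price):
    """Step S1104: apply the inverse function (here, exp) to the predicted second price."""
    return math.exp(predicted_second_price)

training_prices = [50, 100, 400]
second_prices = first_price_conversion(training_prices)   # used to build the third information set
# ... a component would be optimized on (explanatory variables, second prices) here ...
predicted = second_price_conversion(sum(second_prices) / len(second_prices))
print(predicted)   # geometric mean of the training prices; always a positive value
```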
- According to the price estimation apparatus 121, in addition to the effects described above, it is possible to accurately predict prices in a specific price range.
- The reasons are, for example, Reason 1 and Reason 2 below.
- (Reason 1) The configuration of the price estimation apparatus 121 according to the seventh embodiment includes the configuration of the price estimation apparatus according to the above-described embodiments. (Reason 2) Since the first price conversion unit 127 and the second price conversion unit 125 use a specific conversion function, the differences between prices in a specific price range become larger.
- For example, the price estimation device 121 can accurately predict prices in the low price range. Furthermore, when the specific conversion function is a logarithmic function, the value of the inverse function of the specific conversion function is always a positive value. In this case, since the price predicted by the price estimation device 121 is always positive, the price estimation device 121 also has the effect of calculating a more reasonable price.
- the price estimation device 121 can correctly predict the high price range.
Abstract
Description
A prediction data input means for inputting prediction data, which is one or more explanatory variables that are information that can affect a price;
a component determination means for determining the component to be used for predicting the price, based on a hierarchical hidden structure, which is a structure in which one or more nodes are arranged in each layer, hidden variables are represented by a hierarchical structure having paths between the nodes arranged in a first layer and the nodes arranged in a lower second layer, and components representing probability models are arranged at the nodes in the lowest layer of the hierarchical structure, on a gate function model, which is the basis for determining the paths between the nodes constituting the hierarchical hidden structure when determining the component, and on the prediction data; and
a price prediction means for predicting the price based on the component determined by the component determination means and on the prediction data
are provided.
An information processing apparatus inputs prediction data, which is one or more explanatory variables that are information that can affect a price; determines the component to be used for predicting the price, based on a hierarchical hidden structure, which is a structure in which one or more nodes are arranged in each layer, hidden variables are represented by a hierarchical structure having paths between the nodes arranged in a first layer and the nodes arranged in a lower second layer, and components representing probability models are arranged at the nodes in the lowest layer of the hierarchical structure, on a gate function model, which is the basis for determining the paths between the nodes constituting the hierarchical hidden structure when determining the component, and on the prediction data; and predicts the price based on the component thus determined and on the prediction data.
The price to be predicted is, for example, the price of a used building, a used car, a used device, a used game console, used clothing, or the like. The price to be predicted is also, for example, a purchase price at which an intermediary that brokers sales buys an object, or a selling price at which the intermediary sells the object.
FIG. 1 is a block diagram showing an example of the configuration of the price prediction system according to the first embodiment of the present invention.
(Formula 3)
(Formula A)
In the following description, S1, ..., S_(K1×K2) denote the types of observation probability corresponding to φk. For example, in the case of generation probabilities of multivariate data, candidates for S1 to S_(K1×K2) are a normal distribution, a lognormal distribution, an exponential distribution, and the like. Also, for example, when a polynomial curve is output, candidates for S1 to S_(K1×K2) are a zeroth-order curve, a first-order curve, a second-order curve, a third-order curve, and the like.
- Hierarchical hidden structure setting unit 102,
- Initialization processing unit 103,
- Hierarchical hidden variable variational probability calculation processing unit 104 (more specifically, the lowest-layer path hidden variable variational probability calculation processing unit 104-1, the layer setting unit 104-2, the upper-layer path hidden variable variational probability calculation processing unit 104-3, and the layer calculation end determination processing unit 104-4),
- Component optimization processing unit 105,
- Gate function model optimization processing unit 106 (more specifically, the branch node information acquisition unit 106-1, the branch node selection processing unit 106-2, the branch parameter optimization processing unit 106-3, and the all-branch-node optimization end determination processing unit 106-4),
- Optimality determination processing unit 107,
- Optimum model selection processing unit 108.
- Hierarchical hidden structure setting unit 102,
- Initialization processing unit 103,
- Hierarchical hidden variable variational probability calculation processing unit 104,
- Component optimization processing unit 105,
- Gate function model optimization processing unit 106,
- Optimality determination processing unit 107,
- Optimum model selection processing unit 108.
Next, a second embodiment of the price prediction system will be described. The price prediction system according to the present embodiment differs from the price prediction system 10 in that, for example, the hierarchical hidden variable model estimation device 100 is replaced with a hierarchical hidden variable model estimation device 200.
(Formula 5)
Next, a third embodiment of the price prediction system will be described. The price prediction system according to the present embodiment differs from the second embodiment, for example, in the configuration of the hierarchical hidden variable model estimation device. Compared with the hierarchical hidden variable model estimation device 200, the hierarchical hidden variable model estimation device of the present embodiment differs, for example, in that the gate function model optimization processing unit 106 is replaced with the gate function model optimization processing unit 113.
Next, the basic configuration of the hierarchical hidden variable model estimation device will be described. FIG. 17 is a block diagram showing the basic configuration of a hierarchical hidden variable model estimation device according to at least one embodiment of the present invention.
Next, a fourth embodiment of the present invention will be described.
Next, a fifth embodiment of the present invention, which is based on the above-described embodiments, will be described.
Next, a sixth embodiment of the present invention, which is based on the above-described embodiments, will be described.
(Reason 1) The configuration of the price estimation device according to the sixth embodiment includes the configuration of the price estimation device according to the above-described embodiments.
(Reason 2) Since the second information and the first information are similar (or coincide), components, gate function models, and the like that are suitable for classifying the second information can be created.
Next, a seventh embodiment of the present invention, which is based on the above-described embodiments, will be described.
(Reason 1) The configuration of the price estimation device 121 according to the seventh embodiment includes the configuration of the price estimation device according to the above-described embodiments.
(Reason 2) Since the first price conversion unit 127 and the second price conversion unit 125 use a specific conversion function, the differences between prices in a specific price range become larger.
100 Hierarchical hidden variable model estimation device
300 Learning database
500 Model database
700 Price estimation device
111 Input data
101 Data input device
102 Hierarchical hidden structure setting unit
103 Initialization processing unit
104 Hierarchical hidden variable variational probability calculation processing unit
105 Component optimization processing unit
106 Gate function model optimization processing unit
107 Optimality determination processing unit
108 Optimum model selection processing unit
109 Model estimation result output device
112 Model estimation result
104-1 Lowest-layer path hidden variable variational probability calculation processing unit
104-2 Layer setting unit
104-3 Upper-layer path hidden variable variational probability calculation processing unit
104-4 Layer calculation end determination processing unit
104-5 Estimation model
104-6 Hierarchical hidden variable variational probability
701 Data input device
702 Model acquisition unit
703 Component determination unit
704 Price prediction unit
705 Prediction result output device
711 Input data
712 Prediction result
200 Hierarchical hidden variable model estimation device
201 Hierarchical hidden structure optimization processing unit
201-1 Path hidden variable sum operation processing unit
201-2 Path removal determination processing unit
201-3 Path removal execution processing unit
113 Gate function model optimization processing unit
113-1 Effective branch node selection unit
113-2 Branch parameter optimization parallel processing unit
106-1 Branch node information acquisition unit
106-2 Branch node selection processing unit
106-3 Branch parameter optimization processing unit
106-4 All-branch-node optimization end determination processing unit
106-6 Gate function model
80 Learning information input unit
81 Variation probability calculation unit
82 Hierarchical hidden structure setting unit
83 Component optimization processing unit
84 Gate function model optimization unit
90 Prediction data input unit
91 Component determination unit
92 Price prediction unit
93 Price estimation device
1000 Computer
1001 CPU
1002 Main storage device
1003 Auxiliary storage device
1004 Interface
310 Estimation device
311 Estimation unit
410 Second information
411 Rule information
412 Price
94 Prediction data input unit
95 Learning data input unit
96 Variation probability calculation unit
97 Price estimation device
2301 Learning information
2302 Node
2303 Node
2304 Component
2305 Component
2306 Component
2307 Probability information
2308 Condition information
2309 Probability information
2310 Condition information
131 Price estimation device
132 Prediction data input unit
133 Component determination unit
134 Price prediction unit
135 Learning data input unit
136 Data selection unit
137 Variation probability calculation unit
121 Price estimation device
122 Prediction data input unit
123 Component determination unit
124 Price prediction unit
125 Second price conversion unit
126 Learning data input unit
127 First price conversion unit
128 Component optimization processing unit
Claims (12)
- A hierarchical price estimation device comprising: prediction data input means for inputting prediction data, which is one or more explanatory variables that are information that can affect a price; component determination means for determining the component to be used for predicting the price, based on a hierarchical hidden structure, which is a structure in which one or more nodes are arranged in each layer, hidden variables are represented by a hierarchical structure having paths between the nodes arranged in a first layer and the nodes arranged in a lower second layer, and components representing probability models are arranged at the nodes in the lowest layer of the hierarchical structure, on a gate function model, which is the basis for determining the paths between the nodes constituting the hierarchical hidden structure when determining the component, and on the prediction data; and price prediction means for predicting the price based on the component determined by the component determination means and on the prediction data.
- The price estimation device according to claim 1, further comprising optimization means for optimizing the hierarchical hidden structure by excluding the paths whose variational probability, which represents the probability distribution of the hidden variables, does not satisfy a criterion from the targets on which optimization processing is executed in the hierarchical hidden structure.
- The price estimation device according to claim 2, further comprising optimization means including: selection means for selecting, from the nodes of the hierarchical hidden structure, effective branch nodes, which represent the branch nodes on the paths that have not been excluded from the hierarchical hidden structure; and parallel processing means for optimizing the gate function model based on the variational probabilities of the hidden variables at the effective branch nodes, wherein the parallel processing means processes the optimization of each branch parameter related to the effective branch nodes in parallel.
- The price estimation device according to any one of claims 1 to 3, further comprising: setting means for setting the hierarchical hidden structure in which the hidden variables are represented using a binary tree structure; and optimization means for optimizing the gate function model, which is based on the Bernoulli distribution, based on the variational probability representing the probability distribution of the hidden variables at each node.
- The price estimation device according to any one of claims 1 to 3, further comprising variational probability calculation means for calculating the variational probability representing the probability distribution of the hidden variables so as to maximize the marginalized log likelihood.
- A price estimation device comprising price prediction means for predicting the price related to second information, which includes explanatory variables, at a second time that is the prediction target, by applying to the second information rule information that represents a relationship holding between the explanatory variables and the price and that is extracted based on a first information set including first information associating values of the explanatory variables with values of the price, wherein the explanatory variables include an attribute representing a period determined based on a first time at which a specific event occurs regarding the object associated with the first information or the second information, the value of the attribute in the second information is the period between the first time and the second time, and the value of the attribute in the first information is the period between the first time and a third time associated with the price.
- The price estimation device according to claim 6, wherein at least one of the explanatory variables is an attribute representing, when the price at the second time of the object represented by the second information is estimated, the period between a first time at which a specific event occurs regarding the object and the second time.
- The price estimation device according to claim 7, further comprising variational probability calculation means for arranging the attribute on a path in a hierarchical hidden structure, which is a structure in which hidden variables are represented by a hierarchical structure and components representing probability models are arranged at the nodes in the lowest layer of the hierarchical structure, and then calculating the variational probabilities of the hidden variables so as to maximize the marginalized log likelihood.
- The price estimation device according to claim 8, further comprising data selection means for selecting, when the prediction target is a price related to a specific time, specific first information based on the specific time from a first information set including the first information in which the explanatory variables and the price are associated, wherein the variational probability calculation means calculates the variational probability based on the specific first information.
- The price estimation device according to any one of claims 7 to 9, further comprising: first price conversion means for creating third information by applying, to the price in the first information in which the explanatory variables and the price are associated, a conversion function representing a logarithmic function or an exponential function, and associating the second price calculated as a result of the application with the explanatory variables associated with that price in the first information; component optimization processing means for optimizing the components based on the third information; and second price conversion means for predicting the price related to the prediction data by applying the inverse function of the conversion function to the price predicted by the price prediction means.
- A price prediction method in which an information processing apparatus inputs prediction data, which is one or more explanatory variables that are information that can affect a price; determines the component to be used for predicting the price, based on a hierarchical hidden structure, which is a structure in which one or more nodes are arranged in each layer, hidden variables are represented by a hierarchical structure having paths between the nodes arranged in a first layer and the nodes arranged in a lower second layer, and components representing probability models are arranged at the nodes in the lowest layer of the hierarchical structure, on a gate function model, which is the basis for determining the paths between the nodes constituting the hierarchical hidden structure when determining the component, and on the prediction data; and predicts the price based on the component thus determined and on the prediction data.
- A recording medium recording a price estimation program that causes a computer to realize: a prediction data input function of inputting prediction data, which is one or more explanatory variables that are information that can affect a price; a component determination function of determining the component to be used for predicting the price, based on a hierarchical hidden structure, which is a structure in which one or more nodes are arranged in each layer, hidden variables are represented by a hierarchical structure having paths between the nodes arranged in a first layer and the nodes arranged in a lower second layer, and components representing probability models are arranged at the nodes in the lowest layer of the hierarchical structure, on a gate function model, which is the basis for determining the paths between the nodes constituting the hierarchical hidden structure when determining the component, and on the prediction data; and a price prediction function of predicting the price based on the component determined by the component determination function and on the prediction data.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016509950A JP6451736B2 (ja) | 2014-03-28 | 2015-02-27 | 価格推定装置、価格推定方法、及び、価格推定プログラム |
US15/125,267 US20170076307A1 (en) | 2014-03-28 | 2015-02-27 | Price estimation device, price estimation method, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461971594P | 2014-03-28 | 2014-03-28 | |
US61/971,594 | 2014-03-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015145979A1 true WO2015145979A1 (ja) | 2015-10-01 |
Family
ID=54194535
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/001024 WO2015145979A1 (ja) | 2014-03-28 | 2015-02-27 | 価格推定装置、価格推定方法、及び、記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20170076307A1 (ja) |
JP (1) | JP6451736B2 (ja) |
WO (1) | WO2015145979A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017046906A1 (ja) * | 2015-09-16 | 2017-03-23 | 株式会社日立製作所 | データ分析装置および分析方法 |
CN110675176B (zh) * | 2018-07-03 | 2023-03-24 | 百度在线网络技术(北京)有限公司 | 用于生成属性预测模型的方法和装置 |
CN116957635B (zh) * | 2023-09-20 | 2023-12-26 | 中国华能集团清洁能源技术研究院有限公司 | 电力价格获取方法、装置、电子设备及存储介质 |
CN117934049B (zh) * | 2024-03-18 | 2024-05-17 | 中国电子科技集团公司第十五研究所 | 多层级成本计算优化方法、装置、电子设备及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001344463A (ja) * | 2000-05-30 | 2001-12-14 | System Location Co Ltd | 車両再販価格分析システム |
JP2003242325A (ja) * | 2002-02-14 | 2003-08-29 | Kao Corp | 購入意思決定支援装置 |
JP2006099432A (ja) * | 2004-09-29 | 2006-04-13 | Toshiba Corp | 電力取引システム、電力取引方法、電力取引プログラム |
JP2007183769A (ja) * | 2006-01-05 | 2007-07-19 | Osaka Gas Co Ltd | 契約支援システム |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008100902A1 (en) * | 2007-02-12 | 2008-08-21 | Pricelock, Inc. | System and method for estimating forward retail commodity price within a geographic boundary |
2015
- 2015-02-27 JP JP2016509950A patent/JP6451736B2/ja active Active
- 2015-02-27 US US15/125,267 patent/US20170076307A1/en not_active Abandoned
- 2015-02-27 WO PCT/JP2015/001024 patent/WO2015145979A1/ja active Application Filing
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021511557A (ja) * | 2018-05-17 | 2021-05-06 | アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited | ブロックチェーンベースのリソース価値評価方法および装置 |
US11250481B2 (en) | 2018-05-17 | 2022-02-15 | Advanced New Technologies Co., Ltd. | Blockchain-based resource value evaluation methods and apparatus |
JP7060690B2 (ja) | 2018-05-17 | 2022-04-26 | アドバンスド ニュー テクノロジーズ カンパニー リミテッド | ブロックチェーンベースのリソース価値評価方法および装置 |
US11410207B2 (en) | 2018-05-17 | 2022-08-09 | Advanced New Technologies Co., Ltd. | Blockchain-based resource value evaluation methods and apparatus |
JPWO2021065882A1 (ja) * | 2019-09-30 | 2021-04-08 | ||
WO2021065882A1 (ja) * | 2019-09-30 | 2021-04-08 | ダイキン工業株式会社 | 空気調和機の残価算出システムおよび空気調和機の支援システム |
CN114846277A (zh) * | 2019-09-30 | 2022-08-02 | 大金工业株式会社 | 空调机的剩余价值算出系统以及空调机的辅助系统 |
JP7485965B2 (ja) | 2019-09-30 | 2024-05-17 | ダイキン工業株式会社 | 空気調和機の残価算出システム |
CN114846277B (zh) * | 2019-09-30 | 2024-05-28 | 大金工业株式会社 | 空调机的剩余价值算出系统以及空调机的辅助系统 |
Also Published As
Publication number | Publication date |
---|---|
US20170076307A1 (en) | 2017-03-16 |
JP6451736B2 (ja) | 2019-01-16 |
JPWO2015145979A1 (ja) | 2017-04-13 |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15770167; Country of ref document: EP; Kind code of ref document: A1
WWE | Wipo information: entry into national phase | Ref document number: 15125267; Country of ref document: US
ENP | Entry into the national phase | Ref document number: 2016509950; Country of ref document: JP; Kind code of ref document: A
NENP | Non-entry into the national phase | Ref country code: DE
122 | Ep: pct application non-entry in european phase | Ref document number: 15770167; Country of ref document: EP; Kind code of ref document: A1