WO2024071072A1 - Material recommendation device, material recommendation method, and program
- Publication number
- WO2024071072A1 (PCT/JP2023/034824)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/28—Databases characterised by their database models, e.g. relational or object models
- the present invention relates to a material recommendation device, a material recommendation method, and a program.
- This application claims priority based on Japanese Patent Application No. 2022-152385, filed on September 26, 2022, the contents of which are incorporated herein by reference.
- Indicators that users use as evaluation criteria when selecting a product include, for example, the characteristics and performance of the materials used in the product.
- the impressions that users get, such as how the material is perceived through the five senses (sight, touch, hearing, taste, and smell) and how the material appeals to those senses, are also important indicators that users use as evaluation criteria when selecting a product.
- the similar texture material recommendation system described in Patent Document 1 stores, in advance, parameter values that indicate the correspondence between terms (onomatopoeia) expressing texture and the physical properties of a material's surface, and automatically presents recommended materials for an input onomatopoeia based on those parameter values.
- However, the similar texture material recommendation system described in Patent Document 1 only accepts input of onomatopoeia that expresses the sense of touch. In reality, the impression a user gets from a material is not necessarily obtained through the sense of touch alone. The impression a user gets from a material may include impressions obtained through multiple or all of the five senses.
- the present invention was made in consideration of these circumstances, and aims to provide a material recommendation device, a material recommendation method, and a program that can recommend materials according to a person's emotions.
- A material recommendation device comprising: an input unit that accepts input of information indicating a product type and an emotion word indicating a word expressing an emotion; a material parameter extraction unit that extracts a sensory parameter value for each material corresponding to the product type input to the input unit based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation a human feels for the material; an emotion word parameter extraction unit that extracts a sensory parameter value corresponding to the emotion word input to the input unit based on emotion word parameter information in which the emotion word is associated with the sensory parameter values; and a recommended material determination unit that determines the material recommended for the emotion word input to the input unit based on the sensory parameter value for each material extracted by the material parameter extraction unit and the sensory parameter value extracted by the emotion word parameter extraction unit.
- a material recommendation method comprising: an input step of accepting input of information indicating a product type and an emotion word indicating a word expressing an emotion; a material parameter extraction step of extracting a sensory parameter value for each material corresponding to the product type input in the input step based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation a human feels for the material; an emotion word parameter extraction step of extracting a sensory parameter value corresponding to the emotion word input in the input step based on emotion word parameter information in which the emotion word is associated with the sensory parameter value; and a recommended material determination step of determining the material recommended for the emotion word input in the input step based on the sensory parameter value for each material extracted in the material parameter extraction step and the sensory parameter value extracted in the emotion word parameter extraction step.
- the present invention makes it possible to recommend materials according to a person's emotions.
- FIG. 1 is a block diagram showing a functional configuration of a material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 2 is a diagram showing an example of an emotion word input to the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating an example of five sense parameters used in the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 4 is a diagram showing an example of material parameter information 111 stored in a parameter information storage unit 110 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 5 is a diagram showing an example of emotion word parameter information 112 stored in the parameter information storage unit 110 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of modal correction information 113 stored in the parameter information storage unit 110 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating another example of modal correction information 113 stored in the parameter information storage unit 110 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 8 is a flowchart showing an operation of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 9 is a diagram illustrating an example of a correction process for material parameter values performed by the modal correction unit 133 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 10 is a diagram illustrating an example of a correction process for emotion word parameter values performed by the modal correction unit 133 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an example of a calculation process of the distance between coordinates performed by the recommended material determination unit 134 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 12 is a diagram showing an example of information output by an output unit 140 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 13 is a diagram showing another example of information output by the output unit 140 of the material recommendation device 100 according to an embodiment of the present disclosure.
- FIG. 1 is a block diagram showing the functional configuration of the material recommendation device 100 according to an embodiment of the present disclosure.
- the material recommendation device 100 is an information processing device configured using, for example, a general-purpose computer.
- the material recommendation device 100 accepts input of words that express the feeling the user desires from the materials used in a product, that is, words expressing the user's emotions (hereinafter referred to as "emotion words").
- the material recommendation device 100 determines at least one recommended material based on the input emotion words, and outputs information indicating the determined material.
- FIG. 1 is a block diagram showing the functional configuration of a material recommendation device 100 in a first embodiment of the present disclosure.
- the material recommendation device 100 includes a parameter information storage unit 110, an input unit 120, a material parameter extraction unit 131, an emotion word parameter extraction unit 132, a modal correction unit 133, a recommended material determination unit 134, and an output unit 140.
- the parameter information storage unit 110 is configured using a computer-readable storage medium.
- the computer-readable storage medium referred to here is, for example, a magnetic hard disk device or a semiconductor storage device.
- the parameter information storage unit 110 pre-stores various types of parameter information used to determine recommended materials. As shown in FIG. 1, the parameter information storage unit 110 stores at least material parameter information 111, emotion word parameter information 112, and modal correction information 113. Details of each piece of parameter information will be explained later.
- the input unit 120 is configured using an input device that accepts input operations by a user.
- the input device here is, for example, a keyboard, a mouse, a touch panel, and an input button.
- the input unit 120 is configured to include a product type input unit 121, an emotion word input unit 122, and a usage status input unit 123.
- the product type input unit 121 acquires information indicating the product type input by the user (hereinafter referred to as "product type information").
- product type information is information indicating the type of material used in the product, such as, for example, "food packaging material," "housing building material," and "synthetic leather material for bags."
- the emotion word input unit 122 acquires information indicating emotion words input by the user (hereinafter simply referred to as "emotion words").
- the emotion word input unit 122 outputs the acquired emotion words to the emotion word parameter extraction unit 132.
- the emotion words referred to here are words, such as adjectives, used to quantify human sensations in general sensory evaluations. Such adjectives include words that are not limited to a specific sensation among the five senses, such as "favorite," "unique," "comfortable," and "luxurious."
- the SD (semantic differential) method is a technique for measuring a person's impression (sensitivity) of an object using paired adjectives with opposite meanings, and evaluating it numerically.
- the SD method is characterized by its ability to express the structure of an impression with a small number of factors.
- the SD method finds a structure consisting of three basic factors that can be interpreted as an evaluation factor, an activity factor, and a potency factor. This structure is called the EPA (Evaluation, Potency, Activity) structure.
- Evaluation factors are expressed by pairs of adjectives that express a comprehensive evaluation of the value of an object, such as "like-dislike," "good-bad," and "beautiful-ugly."
- evaluation factors are also composed of pairs of adjectives that express the nature of an object, such as "stable-unstable," "clear-muddy," and "warm-cold," although this differs depending on the type of object being evaluated (shape, color, sound, etc.).
- Activity factors are expressed by adjective pairs such as "noisy-quiet," "dynamic-static," and "flashy-subdued."
- Potency factors are expressed by adjective pairs such as "hard-soft," "sharp-dull," and "tense-relaxed."
- FIG. 2 is a diagram showing an example of emotion words input to the material recommendation device 100 in one embodiment of the present disclosure.
- the evaluation factor and the activity factor contain many adjectives that directly express human emotions.
- In this embodiment, words classified under the evaluation factor and the activity factor are defined as emotion words.
- Each adjective of an opposing pair is treated as a separate emotion word.
- the usage status input unit 123 acquires information indicating the usage status of the product (hereinafter referred to as "usage status information") input by the user.
- the usage status input unit 123 outputs the acquired usage status information to the modal correction unit 133.
- the usage status information referred to here is information that can identify through which of the five senses the user receives an impression from a product. Furthermore, when a user receives impressions through more than one of the five senses, the usage status information can also identify the order in which the impressions are received through each sense. For example, when a user receives impressions through two of the five senses, sight and touch, the usage status information can identify whether the user sees the product first and then touches it, or touches the product first and then sees it.
- the material parameter extraction unit 131, the emotion word parameter extraction unit 132, the modal correction unit 133, and the recommended material determination unit 134 are functional units that are realized by a processor such as a CPU (Central Processing Unit) reading and executing a program that is pre-stored in a storage unit (not shown), for example.
- a storage unit and the parameter information storage unit 110 may be configured using the same storage medium.
- the material parameter extraction unit 131 acquires the product type information output from the product type input unit 121.
- the material parameter extraction unit 131 also refers to the material parameter information 111 stored in the parameter information storage unit 110.
- the material parameter extraction unit 131 extracts, from the material parameter information 111, the values of the material parameters associated with the product type that corresponds to the acquired product type information.
- the material parameter information 111 is information that indicates the degree of association between a material and the five sense parameters.
- the five sense parameters referred to here are composed of items that indicate the five senses, or items that indicate sensations that are further subdivided from each of the five senses. In this embodiment, the five sense parameters are composed of items that indicate sensations that are further subdivided from each of the five senses.
- FIG. 3 is a diagram showing an example of the five sense parameters used in the material recommendation device 100 in one embodiment of the present disclosure. As shown in FIG. 3, touch is further classified into the five sense parameters "hardness," "roughness," "friction," and "warm/cold." Vision is further classified into the five sense parameters "color 1," "color 2," "color 3," "texture 1," "texture 2," and "texture 3."
- touch can be classified into four elements: hardness, roughness, friction, and warmth/coldness. More specifically, for example, the values of average roughness Ra and maximum height Rz defined by the standard (JIS B 0601) may be used as the five sense parameter for the touch element "roughness." Alternatively, to reduce the dimensionality, parameters obtained by subjecting the average roughness Ra, maximum height Rz, and the like to principal component analysis or factor analysis may be used. For example, the value of Young's modulus calculated from the pressing force and displacement may be used as the five sense parameter for the touch element "hardness." For example, the value of the dynamic friction coefficient when touching a material may be used as the five sense parameter for the touch element "friction."
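As an illustration of the "hardness" example above, Young's modulus is stress divided by strain. The sketch below is a minimal, hypothetical calculation; the function and parameter names are not from the patent:

```python
def youngs_modulus(force_n, area_m2, displacement_m, length_m):
    """Illustrative sketch: Young's modulus E = stress / strain,
    one way the "hardness" touch parameter could be derived from a
    pressing force and the resulting displacement.
    All names here are hypothetical, not from the patent."""
    stress = force_n / area_m2          # pressure applied to the material [Pa]
    strain = displacement_m / length_m  # relative deformation [dimensionless]
    return stress / strain              # Young's modulus [Pa]

# 100 N applied over 1 cm^2, producing 1 mm displacement on a 100 mm sample
e = youngs_modulus(100.0, 0.0001, 0.001, 0.1)
```

Any consistent measurement protocol works, as long as the same units and procedure are used for every material in the database.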
- When the material recommendation device 100 is adapted to various materials, it is desirable to obtain uniformly evaluated touch parameter values for all materials, for example by using a wearable device. By attaching a wearable device to a person's hand and actually touching each material, friction, hardness, and the like can be measured uniformly.
- the five sense parameters of vision shown in FIG. 3 include "color 1," "color 2," and "color 3," and it is generally known that color can be expressed using three variables, such as L*a*b* or L*C*H. For example, these three variables may be used as the five sense parameters "color 1," "color 2," and "color 3," respectively.
- the material recommendation device 100 may use the three variables in any color space, but it is necessary to evaluate all materials using the three variables in the same color space.
- texture refers to physical property information related to reflectance, such as the bidirectional reflectance distribution function (BRDF).
- For taste, the type and amount of taste-related chemical components are thought to be factors that affect people's impressions.
- For smell, the type and concentration of odor-related chemical components are thought to be factors that affect people's impressions.
- For hearing, the signal strength and power spectrum of the sound produced by a material are thought to be factors that affect people's impressions.
- the value of a material parameter may be, for example, an actual measurement obtained by some measurement method, or the material parameter values of all materials may be standardized values. However, if actual measurements are used, the unit of each material parameter value must be unified across all materials. For example, if the average roughness Ra value specified by the standard is used as the material parameter for "roughness," the "roughness" value for all materials should be expressed in a unified unit such as millimeters or micrometers.
- FIG. 4 is a diagram showing an example of material parameter information 111 stored in the parameter information storage unit 110 of the material recommendation device 100 in one embodiment of the present disclosure.
- the material parameter information 111 is data indicating the association between a material and the five sensory parameters described above.
- the five sensory parameters associated with a material will be referred to as "material parameters.”
- the product type, product number, information indicating the material composition of the product, and material parameters are associated with each other.
- Each row of the material parameter information 111 is information about one material.
- the value "food packaging material" is registered in the "product type" field, and the value "101a001b001" is set in the "product number" field.
- the material to which the product number "101a001b001" is assigned is a material used for food packaging material.
- the value "101" is set in the "main component" field of the "product material composition" field
- the value "a001" is set in the "additive 1" field
- the value "b001" is set in the "additive 2" field.
- the material to which the product number "101a001b001" is assigned is a material that includes a main component assigned the identification information "101", an additive assigned the identification information "a001", and an additive assigned the identification information "b001".
- For each material parameter in the material parameter information 111, a standardized value (centered on 0, for example) is registered. Standardization is an operation in which the average of a parameter over the entire database is subtracted from each value and the result is divided by the standard deviation, yielding values with a mean of 0 and a variance of 1.
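The standardization operation described above can be sketched as follows; this is a minimal illustration, and the sample roughness values are invented:

```python
def standardize(values):
    """Z-score standardization: subtract the mean of the whole
    database from each value and divide by the standard deviation,
    so the results have mean 0 and variance 1."""
    n = len(values)
    mean = sum(values) / n
    # population standard deviation over the database
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / std for v in values]

# Invented raw "roughness" values (micrometres) for four materials
raw = [0.8, 1.2, 1.0, 1.0]
z = standardize(raw)
```

The same operation applies to the emotion word parameters described later; only the underlying database differs.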
- each material is quantified by the five sense parameters (material parameters).
- the value of the five sense parameter "hardness" for product number "101a001b001" is "0.12”
- the value of the five sense parameter "warm/cold" is "0.03".
- taste may slightly affect the impression of food packaging materials, but has no effect on the impression of building materials for homes or synthetic leather materials for bags.
- the emotion word parameter extraction unit 132 acquires emotion words output from the emotion word input unit 122. As shown in the example in FIG. 2, emotion words are words that express human emotions.
- the emotion word parameter extraction unit 132 also refers to emotion word parameter information 112 stored in the parameter information storage unit 110.
- the emotion word parameter extraction unit 132 extracts the value of an emotion word parameter associated with the acquired emotion word from the emotion word parameter information 112.
- Emotion word parameter information 112 is information that indicates the degree of association between emotion words and five sense parameters.
- the five sense parameters are composed of items that indicate sensations that are further subdivided into each of the five senses, for example, as shown in FIG. 3.
- FIG. 5 is a diagram showing an example of emotion word parameter information 112 stored in the parameter information storage unit 110 of the material recommendation device 100 in one embodiment of the present disclosure.
- the emotion word parameter information 112 is data indicating the association between emotion words and the five sense parameters described above.
- the five sense parameters associated with emotion words are referred to as "emotion word parameters.”
- emotion words and emotion word parameters are associated with each other.
- Each row of the emotion word parameter information 112 is information about one emotion word.
- the value of the five sense parameter "texture 1" for the emotion word “beautiful” is "0.73"
- the value of the five sense parameter "texture 2" is "-0.15".
- each emotion word is quantified in advance using five sense parameters (emotion word parameters).
- One possible method for quantifying the emotion word parameters is to perform, in advance, a sensory evaluation of each material using emotion words with multiple subjects, and then create a regression model between the emotion words and the five sense parameters to determine the numerical values.
- these emotion word parameters may use standardized values using the entire emotion word database.
- Standardization is an operation in which the average of an emotion word parameter over the entire database is subtracted from each value and the result is divided by the standard deviation, yielding values with a mean of 0 and a variance of 1. In this way, each emotion word is quantified using the five sense parameters (emotion word parameters).
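The regression-based quantification mentioned above could, under simplifying assumptions, be realized as an ordinary least-squares fit between subjects' ratings for an emotion word and one five sense parameter. The sketch below uses invented data and a single predictor; a practical system would regress over all parameters at once:

```python
def fit_linear(xs, ys):
    """Least-squares fit y = a*x + b, one simple way to relate mean
    subject ratings for an emotion word (ys) to a five sense
    parameter of the evaluated materials (xs)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx        # slope
    b = my - a * mx      # intercept
    return a, b

# Invented data: "texture 1" values of four evaluated materials and
# mean subject ratings for the emotion word "beautiful"
texture1 = [0.1, 0.4, 0.7, 1.0]
rating = [1.0, 2.2, 3.1, 4.0]
a, b = fit_linear(texture1, rating)
```

A positive slope here would indicate that higher "texture 1" values are associated with stronger "beautiful" ratings in the sensory evaluation.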
- the modal correction unit 133 acquires the usage status information output from the usage status input unit 123.
- the usage status information identifies the situation in which the user receives an impression and through which of the five senses the impression is received.
- the modal correction unit 133 refers to the modal correction information 113 stored in the parameter information storage unit 110.
- the modal correction unit 133 corrects the values of the material parameters extracted by the material parameter extraction unit 131 and the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132 based on the acquired usage information and the modal correction information 113.
- the modal correction unit 133 identifies, based on the usage situation information, the order in which the impressions are received through each sense.
- the modal correction unit 133 extracts, from the modal correction information 113, a set of coefficients for taking into account the cross-modal effect occurring in the identified usage situation.
- the modal correction unit 133 multiplies the extracted set of coefficients by the values of the material parameters extracted by the material parameter extraction unit 131 and the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132, respectively.
- a cross-modal effect refers to a relationship between impressions received through two of the five senses, such as vision and touch.
- a multi-modal effect refers to a relationship between impressions received through multiple senses, such as vision, touch, and hearing.
- In this specification, the term "cross-modal effect" is used broadly so as to also encompass the multi-modal effect.
- a user when a user recognizes a bag, they may look at it visually and then touch it to check its texture, or conversely, they may touch it to check its texture and then look at it visually.
- the overall impression that the user gets from a material will differ depending on whether the impression is received first by the sense of sight or touch.
- materials such as synthetic leather for bags give the user different impressions depending on the order in which multiple senses are activated, creating a cross-modal effect.
- the material recommendation device 100 in this embodiment takes into account the cross-modal effect according to the usage situation based on the acquired usage situation information. Specifically, as described above, the modal correction unit 133 multiplies the values of the material parameters extracted by the material parameter extraction unit 131 and the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132 by a set of coefficients for taking the cross-modal effect into account. As a result, the values of the material parameters and the emotion word parameters are corrected to values that reflect the influence of the cross-modal effect according to the usage situation information.
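The correction step described above amounts to an element-wise multiplication of parameter values by a usage-situation-specific coefficient set. The sketch below is a minimal illustration; all coefficient values, dictionary keys, and the situation name are hypothetical:

```python
# Hypothetical coefficient set for one usage situation ("the user
# first sees the product, then touches it"). Following the rule in
# the text, the visual parameters keep coefficient 1.0, the tactile
# parameters get a reduced coefficient, and senses that do not
# occur are zeroed out. All numbers are invented for illustration.
MODAL_COEFFS = {
    "see_then_touch": {
        "color1": 1.0, "color2": 1.0, "color3": 1.0,
        "texture1": 1.0, "texture2": 1.0, "texture3": 1.0,
        "hardness": 0.6, "roughness": 0.6, "friction": 0.6, "warm_cold": 0.6,
    },
}

def modal_correct(params, situation, table=MODAL_COEFFS):
    """Multiply each parameter value by the coefficient for the
    given usage situation; parameters absent from the coefficient
    set (unused senses) are set to 0."""
    coeffs = table[situation]
    return {k: v * coeffs.get(k, 0.0) for k, v in params.items()}

material = {"hardness": 0.12, "color1": -0.5, "taste1": 0.3}
corrected = modal_correct(material, "see_then_touch")
```

The same function can be applied to emotion word parameter vectors with a second coefficient table, mirroring the separate tables of FIGS. 6 and 7.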
- Modal correction information 113 including the group of coefficients to be multiplied by the material parameter values (for example, the modal correction information 113 shown in FIG. 6) and modal correction information 113 including the group of coefficients to be multiplied by the emotion word parameter values (for example, the modal correction information 113 shown in FIG. 7) are pre-stored separately in the parameter information storage unit 110.
- FIGS. 6 and 7 are diagrams showing an example of modal correction information 113 stored in the parameter information storage unit 110 of the material recommendation device 100 in one embodiment of the present disclosure.
- Figure 6 shows an example of modal correction information 113 including a group of coefficients by which the material parameter values are multiplied.
- Figure 6 shows a group of coefficients corresponding to each piece of usage information, which are used to correct the material parameter values.
- the coefficient value of the sensation received first is preset to 1
- the coefficient value of the sensation received later is preset to a value greater than 0 and less than 1
- the coefficient values of the other sensations are preset to 0.
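The preset rule above (sense received first gets 1, sense received later gets a value between 0 and 1, all other senses get 0) can be sketched as a small constructor for one row of such a coefficient table. The 0.5 default and the sense names are illustrative only; the patent does not specify concrete values:

```python
def make_coeff_row(first, later=None, later_value=0.5,
                   senses=("sight", "touch", "hearing", "taste", "smell")):
    """Build one row of modal correction coefficients per the rule:
    the sense received first -> 1.0, the sense received later -> a
    value in (0, 1) (0.5 here, purely illustrative), all other
    senses -> 0.0."""
    row = {s: 0.0 for s in senses}
    row[first] = 1.0
    if later is not None:
        row[later] = later_value
    return row

# Example: the user sees the product first, then touches it
row = make_coeff_row("sight", "touch")
```

In practice each row would be stored in the modal correction information 113, keyed by the usage status information.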
- As a method for quantifying such coefficients, similarly to the case described above in which emotion words are quantified in advance using the five sense parameters, a sensory evaluation may be carried out in advance on multiple subjects using each material in each usage situation, and a regression model between the usage situation and the parameter values may be created to determine the numerical values.
- FIG. 7 shows an example of modal correction information 113 including a group of coefficients by which the value of the emotion word parameter is multiplied.
- FIG. 7 shows a group of coefficients corresponding to each piece of usage information, which are used to correct the value of the emotion word parameter.
- the coefficient value of the sensation that produces a cross-modal effect is preset to 1
- the coefficient values of the other sensations are preset to 0.
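The two coefficient rules above (FIG. 6 for material parameters, FIG. 7 for emotion word parameters) can be sketched as small lookup builders. The sense names, the 0.5 "later sense" weight, and the dictionary layout are assumptions for illustration; the actual modal correction information 113 is a pre-stored table:

```python
SENSES = ["sight", "touch", "hearing", "taste", "smell"]

def material_coeffs(first, last, later_weight=0.5):
    """FIG. 6 rule: first sense -> 1, later sense -> 0 < c < 1, others -> 0."""
    return {s: 1.0 if s == first else later_weight if s == last else 0.0
            for s in SENSES}

def emotion_coeffs(cross_modal_sense):
    """FIG. 7 rule: the sense producing the cross-modal effect -> 1, others -> 0."""
    return {s: 1.0 if s == cross_modal_sense else 0.0 for s in SENSES}

print(material_coeffs("sight", "touch"))
print(emotion_coeffs("touch"))
```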
- the material parameter extraction unit 131 outputs information indicating the material parameters corrected by the modal correction unit 133 to the recommended material determination unit 134.
- the emotion word parameter extraction unit 132 outputs information indicating the emotion word parameters corrected by the modal correction unit 133 to the recommended material determination unit 134.
- the recommended material determination unit 134 acquires information indicating the material parameters output from the material parameter extraction unit 131 and information indicating the emotion word parameters output from the emotion word parameter extraction unit 132. Based on the acquired information, the recommended material determination unit 134 calculates the distance between the coordinates of the material parameter value and the emotion word parameter value for each material. The recommended material determination unit 134 determines the material to be recommended based on the calculated distance. For example, the recommended material determination unit 134 determines the material with the shortest calculated distance, or a predetermined number of materials in order of shortest calculated distance, as the recommended materials.
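The selection logic described above can be sketched as follows. The 4-component vectors are hypothetical (real parameter vectors have n components); the second material's values are invented for contrast:

```python
import math

# Hypothetical corrected parameter vectors keyed by product number.
materials = {
    "301a201b101": [-0.09, 0.71, 0.69, 0.26],
    "301a201b102": [0.10, -0.50, -0.40, 0.00],
}
# Hypothetical corrected emotion word parameter vector.
emotion_vec = [0.03, -0.81, -0.62, 0.03]

def distance(a, b):
    """Euclidean distance between two parameter coordinates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Rank materials by shortest distance: the closest is the top recommendation.
ranked = sorted(materials, key=lambda m: distance(materials[m], emotion_vec))
print(ranked)
```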
- the recommended material determination unit 134 outputs information indicating the recommended material to the output unit 140. Note that the recommended material determination unit 134 may also output information indicating the distance between the material parameter value and the emotional word parameter value for each recommended material.
- in this embodiment, a method of simply comparing the distance between the material parameter value and the emotion word parameter value is used to select materials that match the emotion words; however, other methods may be used, such as solving an inverse problem by Bayesian optimization, or solving a forward problem that minimizes error by performing a grid search. It is also possible to configure a system that performs the above tasks by using AutoML (Automated Machine Learning) tools such as DataRobot (registered trademark).
- the output unit 140 is configured using an output device that presents information to the user.
- the output device here is, for example, a display device such as a liquid crystal display (LCD), a cathode ray tube (CRT), or an organic electro-luminescence (EL) display.
- the output unit 140 may also be a communication interface that transmits information to an external device that presents the information to the user.
- the external device is, for example, an information processing device such as a general-purpose computer, various displays, or a printer.
- the output unit 140 acquires the information indicating the recommended materials output from the recommended material determination unit 134.
- the output unit 140 presents the information indicating the recommended materials to the user.
- FIG. 8 is a flowchart showing the operation of the material recommendation device 100 according to an embodiment of the present disclosure.
- the product type input unit 121 accepts input of product type information by the user. For example, a list of values of the "product type" item, which is the leftmost column of the material parameter information 111 shown in FIG. 4 ("food packaging material," "residential building material," "synthetic leather for bags," ...), is displayed on a display device such as a display. The user performs an input operation to select the desired product type from the displayed list using an input device such as a mouse. Note that, as an example, it is assumed here that the user has selected the product type "synthetic leather for bags."
- the product type input unit 121 outputs product type information indicating "synthetic leather for bags" to the material parameter extraction unit 131.
- the material parameter extraction unit 131 acquires the product type information output from the product type input unit 121 (step S01).
- the emotion word input unit 122 accepts the input of emotion words by the user. For example, a list of emotion words as shown in FIG. 2 ("favorite," "flashy," "adorable," ...) is displayed on a display device such as a display. The user performs an input operation to select a desired emotion word from the displayed list using an input device such as a mouse. Note that, as an example, it is assumed here that the user has selected the emotion word "beautiful." The input is not limited to this example, and multiple emotion words may be selected; in that case, the following process is executed for each emotion word.
- the emotion word input unit 122 outputs the emotion word "beautiful" to the emotion word parameter extraction unit 132.
- the emotion word parameter extraction unit 132 acquires the emotion word output from the emotion word input unit 122 (step S02).
- the usage status input unit 123 accepts the input of usage status information by the user. For example, a list of values for the "first" and “last" items of "cross-modal effect" ("tactile”, “none”, “visual”, “none”, “auditory”, “none”, ...) in the second column from the left of the modal correction information 113 shown in FIG. 6 or 7 is displayed on a display device such as a display. The user imagines the usage situation when using the product and performs an input operation to select the desired usage status information from the displayed list using an input device such as a mouse. Note that, as an example, it is assumed here that the user selects usage status information in which the first sensation received is "visual" and the second sensation received is "tactile".
- the usage status input unit 123 outputs usage status information indicating that the first sensation is "visual" and the second sensation is "tactile” to the modal correction unit 133.
- the modal correction unit 133 acquires the usage status information output from the usage status input unit 123 (step S03).
- a list of options such as “Decide whether to purchase just by looking at the bag” or “Decide whether to purchase after checking the feel of the bag after looking at it” may be displayed on a display device such as a display, and the user may select from the list.
- the parameter information storage unit 110 stores in advance information that associates these options with one or more senses. That is, for example, it stores in advance information in which "Decide whether to purchase just by looking at the bag" is associated with "sight," and "Decide whether to purchase after checking the feel of the bag after looking at it" is associated with "sight" and "touch."
- the modal correction unit 133 can determine by referring to the above information stored in advance in the parameter information storage unit 110 that the usage status information selected is one in which the first sensation received is "sight" and the second sensation received is "touch.”
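The option-to-senses association described above might be held as a simple mapping; a minimal sketch, assuming the two example options and an ordered tuple of senses (all names are illustrative):

```python
# Hypothetical association between displayed options and ordered senses,
# standing in for the information pre-stored in the parameter information
# storage unit 110.
OPTION_SENSES = {
    "Decide whether to purchase just by looking at the bag": ("sight",),
    "Decide whether to purchase after checking the feel of the bag after looking at it": ("sight", "touch"),
}

def senses_for(option):
    """Return (first, second) sensations for a selected option; second may be None."""
    senses = OPTION_SENSES[option]
    return senses[0], senses[1] if len(senses) > 1 else None

first, second = senses_for(
    "Decide whether to purchase after checking the feel of the bag after looking at it")
print(first, second)
```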
- the material parameter extraction unit 131 extracts the values of the material parameters associated with the product type corresponding to the acquired product type information from the material parameter information 111 pre-stored in the parameter information storage unit 110 (step S04). For example, since the product type "synthetic leather for bags" has been selected by the user here, the material parameter extraction unit 131 extracts all the data (values of the material parameters) in the rows in which the value of the "product type” item in the material parameter information 111 shown in FIG. 4 is "synthetic leather for bags".
- if the number of matching materials is m (here, 5), the material parameter extraction unit 131 extracts m matrices each consisting of 1 row and n columns (hereinafter referred to as "matrix A").
- the modal correction unit 133 corrects the values of the material parameters extracted by the material parameter extraction unit 131 based on the acquired usage information and the modal correction information 113 (step S05).
- the usage status information selected by the user is one in which the first sensation is "visual” and the second sensation is "tactile". Therefore, the modal correction unit 133 extracts data (a group of coefficients) from a row in which the value of the "first" item in the "cross-modal effect” item of the modal correction information 113 shown in FIG. 6 is “visual” and the value of the "last” item is "tactile”. If the number of material parameters associated with each material is n, the modal correction unit 133 extracts one matrix consisting of 1 row and n columns. The modal correction unit 133 then generates a diagonal matrix consisting of n rows and n columns (hereinafter referred to as "diagonal matrix B”) with the extracted matrix as its diagonal components.
- the modal correction unit 133 corrects the values of the material parameters by multiplying each of the extracted matrices A by the generated diagonal matrix B. This causes the values of the material parameters to be corrected to values that take into account the cross-modal effects according to the selected usage information.
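A minimal numpy sketch of this step, with invented raw parameter values and coefficients and only n = 4 columns shown (the actual matrices have n columns matching the number of material parameters):

```python
import numpy as np

# Hypothetical matrix A: one material's raw parameter values (1 x n).
A = np.array([[-0.10, 1.00, 1.00, 0.40]])
# Hypothetical coefficient row extracted from the modal correction information.
coeffs = np.array([0.90, 0.71, 0.69, 0.65])
# Diagonal matrix B (n x n) with the coefficient row as its diagonal.
B = np.diag(coeffs)

# Corrected parameter values: each column of A scaled by its coefficient.
corrected = A @ B
print(corrected)
```

Multiplying by a diagonal matrix is equivalent to element-wise scaling of the row, which is why a single coefficient row suffices to define B.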
- FIG. 9 is a diagram showing an example of a material parameter value correction process performed by the modal correction unit 133 of the material recommendation device 100 in one embodiment of the present disclosure.
- FIG. 9 shows a matrix calculation when correcting the material parameter values of a synthetic leather material for a bag that is assigned the product number "301a201b101".
- a matrix consisting of the corrected material parameter values ("-0.09", "0.71", "0.69", "0.26", ...) is calculated by multiplying matrix A, which consists of the material parameter values extracted from the material parameter information 111 shown in FIG. 4, by diagonal matrix B, which is generated from the group of coefficients extracted from the modal correction information 113 shown in FIG. 6.
- the emotion word parameter extraction unit 132 extracts the value of the emotion word parameter associated with the acquired emotion word from the emotion word parameter information 112 pre-stored in the parameter information storage unit 110 (step S06). For example, since the emotion word "beautiful" has been selected by the user here, the emotion word parameter extraction unit 132 extracts data (emotion word parameter value) from the row in which the value of the "emotion word” item in the emotion word parameter information 112 shown in FIG. 5 is "beautiful.”
- the emotion word parameter extraction unit 132 extracts m matrices (hereinafter referred to as "matrix C") consisting of one row and n columns.
- the value of m represents the number of emotion words selected by the user.
- the emotion word parameter extraction unit 132 extracts one matrix C.
- the modal correction unit 133 corrects the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132 based on the acquired usage information and the modal correction information 113 (step S07).
- the usage information selected by the user is one in which the first sensation is “visual” and the second sensation is “tactile".
- the modal correction unit 133 extracts data (a group of coefficients) from rows in which the value of the "first" item in the "cross-modal effect” item of the modal correction information 113 shown in FIG. 7 is “visual” and the value of the "last” item is "tactile”. If the number of emotion word parameters associated with each emotion word is n, the modal correction unit 133 extracts one matrix consisting of 1 row and n columns. The modal correction unit 133 then generates a diagonal matrix consisting of n rows and n columns (hereinafter referred to as "diagonal matrix D”) with the extracted matrix as its diagonal components.
- the modal correction unit 133 corrects the values of the emotion word parameters by multiplying each of the extracted matrices C (here, one) by the generated diagonal matrix D. In this way, the emotion word parameter values are corrected to values that take into account the cross-modal effect.
- FIG. 10 is a diagram showing an example of the correction process of the emotion word parameter values performed by the modal correction unit 133 of the material recommendation device 100 in one embodiment of the present disclosure.
- FIG. 10 shows a matrix calculation when correcting the emotion word parameter value corresponding to the emotion word "beautiful".
- a matrix C consisting of the emotion word parameter values extracted from the emotion word parameter information 112 shown in FIG. 5 is multiplied by the diagonal matrix D generated from the group of coefficients extracted from the modal correction information 113 shown in FIG. 7 to calculate a matrix consisting of the corrected emotion word parameter values ("0.03", "-0.81", "-0.62", "0.03", ...).
- the material parameter extraction unit 131 outputs the corrected material parameter values to the recommended material determination unit 134.
- the emotion word parameter extraction unit 132 outputs the corrected emotion word parameter values to the recommended material determination unit 134.
- the recommended material determination unit 134 obtains the corrected material parameter values output from the material parameter extraction unit 131 and the corrected emotion word parameter values output from the emotion word parameter extraction unit 132.
- the recommended material determination unit 134 regards the obtained material parameter matrix and emotion word parameter matrix as coordinates, and calculates the distance between the coordinates for each material.
- the recommended material determination unit 134 determines the material to be recommended based on the calculated distance (step S08). For example, from among multiple materials, materials with a short distance between the coordinates of the emotion word parameter corresponding to the emotion word specified by the user and the coordinates of the material parameter corresponding to the material can be regarded as having a strong relationship with the emotion word, and can be identified in order.
- FIG. 11 is a diagram showing an example of a calculation process of the distance between coordinates performed by the recommended material determination unit 134 of the material recommendation device 100 in one embodiment of the present disclosure.
- FIG. 11 shows a process of subtracting the components of a matrix consisting of the corrected material parameter values calculated by the correction process shown in FIG. 9 from the matrix consisting of the corrected emotion word parameter values calculated by the correction process shown in FIG. 10, squaring each value of the calculated matrix components, and further adding up the calculated values to perform a root calculation to calculate the distance between coordinates.
- the value of the emotion word parameter for "hardness” is "0.03"
- the value of the material parameter for "hardness” is "-0.09"
- the recommended material determination unit 134 adds up the component values calculated for each of the five sense parameters in this way, and further performs a root calculation to calculate the value, which is the distance between the coordinates of the emotion word parameter and the material parameter.
- the calculated distance between the coordinates is, for example, the square root of "4.0938 + ...".
- the calculation of the distance between the coordinates is not limited to this example, and may be calculated by other methods known in the technical field.
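Using only the four component values shown in FIGS. 9 and 10, the partial sum of squared differences reproduces the "4.0938" mentioned above; the remaining components of the full n-dimensional vectors are omitted:

```python
import math

# Corrected emotion word parameter values (shown components only).
emotion = [0.03, -0.81, -0.62, 0.03]
# Corrected material parameter values (shown components only).
material = [-0.09, 0.71, 0.69, 0.26]

# Squared component differences, e.g. (0.03 - (-0.09))**2 = 0.0144 for "hardness".
squared = [(e - m) ** 2 for e, m in zip(emotion, material)]
partial_sum = sum(squared)
print(round(partial_sum, 4))   # partial sum over the shown components
print(math.sqrt(partial_sum))  # distance over the shown components only
```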
- the recommended material determination unit 134 performs the above calculation for each material and each emotion word, and calculates the distance between the coordinates.
- the recommended material determination unit 134 determines the material to be recommended based on the calculated distance between the coordinates.
- the recommended material determination unit 134 outputs information indicating the recommended material and information indicating the calculated distance between the coordinates for the material to the output unit 140.
- the output unit 140 acquires information indicating the recommended material output from the recommended material determination unit and information indicating the distance between the coordinates calculated for the material.
- the output unit 140 presents the information indicating the recommended material and information indicating the distance between the coordinates calculated for the material to the user (step S09).
- the information output by the output unit 140 includes, for example, the selected product type ("synthetic leather for bags"), the selected emotion word ("beautiful"), and the value of the distance between the coordinates for each material (for each product number).
- the output unit 140 outputs the materials in order of the shortest distance between the coordinates, so that the more recommended materials are displayed at the top.
- the distance between the coordinates is also displayed as a bar graph, allowing the user to more intuitively recognize the recommendation level of each material.
- however, the display is not limited to this example, and the distance between the coordinates may instead be displayed in a predetermined order of the product numbers.
- FIG. 13 is a diagram showing another example of information output by the output unit 140 of the material recommendation device 100 in one embodiment of the present disclosure. Here, it is assumed that the user has selected the emotion words "flashy" and "manly."
- the information output by the output unit 140 includes, for example, the selected product type ("synthetic leather for bags”).
- the information output by the output unit 140 also includes, for example, the value of the distance between the coordinates for each material (for each product number) for each selected emotion word ("flashy” and "manly”).
- the information output by the output unit 140 also includes, for example, the average value of the distance between the coordinates for multiple emotion words for each material.
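The per-emotion-word distances and their averages can be sketched as below. The "flashy" distance 8.7 for "303a201b102", and the "manly" distance 8.3 and average 9.4 for "303a201b104", follow the text; the other two values are invented to complete the table:

```python
# Hypothetical coordinate distances per material and emotion word.
distances = {
    "303a201b102": {"flashy": 8.7, "manly": 10.3},   # "manly" value invented
    "303a201b104": {"flashy": 10.5, "manly": 8.3},   # "flashy" value invented
}

# Average distance over the selected emotion words for each material.
averages = {m: sum(d.values()) / len(d) for m, d in distances.items()}
print(averages)
```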
- the user can recognize that the synthetic leather material for bags that is closest to the impression of "flashy" is the material assigned the product number "303a201b102", with a coordinate distance of "8.7".
- the user can also recognize that the synthetic leather material for bags that is closest to the impression of "manly" is the material assigned the product number "303a201b104", with a coordinate distance of "8.3".
- the user can also recognize that the synthetic leather material for bags that, on average, comes closest to both the impression of "flashy" and the impression of "manly" is the material assigned the product number "303a201b104", with an average coordinate distance of "9.4".
- when three emotion words are selected, the average value over the three emotion words may be output, or the average value may be output for each combination of two of the three emotion words.
- the output unit 140 may generate and present a map that allows visual understanding of the coordinates of the material parameters and the coordinates of the emotion word parameters.
- the output unit 140 may perform analysis using principal component analysis or multidimensional scaling on the coordinates of the material parameters and the coordinates of the emotion word parameters, and visually present the analysis results.
- for example, when using principal component analysis, the output unit 140 may present the positional relationship between the coordinates of the material parameters and the coordinates of the emotion word parameters in a two-dimensional BiPlot diagram whose two axes are the first principal component and the second principal component.
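A hedged sketch of such a projection using SVD-based principal components (numpy only, as a minimal stand-in for a full PCA/BiPlot implementation). The 5-dimensional coordinates are hypothetical:

```python
import numpy as np

# Hypothetical sensory coordinates: two material rows and one emotion word row.
coords = np.array([
    [-0.09, 0.71, 0.69, 0.26, 0.10],
    [0.10, -0.50, -0.40, 0.00, 0.20],
    [0.03, -0.81, -0.62, 0.03, 0.00],
])

# Center the data, then take the top-2 right singular vectors as the
# first and second principal components.
centered = coords - coords.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
projected = centered @ vt[:2].T  # 2-D positions for a scatter/biplot

print(projected.shape)
```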
- the material recommendation device 100 in one embodiment of the present disclosure includes an input unit 120, a material parameter extraction unit 131, an emotion word parameter extraction unit 132, and a recommended material determination unit 134.
- the input unit 120 accepts input of information indicating a product type and an emotion word indicating a word expressing an emotion.
- the material parameter extraction unit 131 extracts a sensory parameter (material parameter) value for each material corresponding to the product type input to the input unit 120 based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation humans experience from the material.
- the emotion word parameter extraction unit 132 extracts a sensory parameter (emotion word parameter) value corresponding to the emotion word input to the input unit 120 based on emotion word parameter information in which emotion words are associated with sensory parameter values.
- the recommended material determination unit 134 determines the material recommended for the emotion word input to the input unit 120 based on the sensory parameter values for each material extracted by the material parameter extraction unit 131 and the sensory parameter values extracted by the emotion word parameter extraction unit 132.
- the material recommendation device 100 can automatically recommend appropriate materials according to a person's emotions, without relying on the sense of a specific person who recommends the materials.
- the material recommendation device 100 corrects the sensory parameter values for each material and the sensory parameter values corresponding to emotion words to take into account the influence of cross-modal effects. With this configuration, the material recommendation device 100 can more accurately recommend appropriate materials according to a person's emotions.
- the modal correction information 113, the usage status input unit 123, and the modal correction unit 133 may be omitted to form a simpler configuration.
- the material recommendation device 100 may determine a recommended material without correcting the sensory parameter values for each material and the sensory parameter values according to the usage status information. In this way, by making the configuration simpler, it is possible to reduce the device cost and operation cost of the material recommendation device 100, and the man-hours required to prepare the modal correction information 113, etc.
- the above-mentioned processing may be performed by recording a program for implementing the functions of the material recommendation device 100 in the embodiment on a computer-readable recording medium, and having the computer system read and execute the program recorded on the recording medium.
- computer system includes hardware such as an OS and peripheral devices.
- computer system also includes a WWW system equipped with a homepage providing environment (or display environment).
- computer-readable recording medium refers to portable media such as flexible disks, optical magnetic disks, ROMs, and CD-ROMs, and storage devices such as hard disks built into a computer system.
- computer-readable recording medium also refers to storage devices that hold a program for a certain period of time, such as volatile memory (RAM) inside a computer system that becomes a server or client when a program is transmitted via a network such as the Internet or a communication line such as a telephone line.
- the above program may also be transmitted from a computer system in which the program is stored in a storage device or the like to another computer system via a transmission medium, or by transmission waves in the transmission medium.
- the "transmission medium” that transmits the program refers to a medium that has the function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
- the above program may also be one that realizes part of the above-mentioned functions. Furthermore, it may be a so-called difference file (difference program) that can realize the above-mentioned functions in combination with a program already recorded in the computer system.
- Reference Signs List: 100: Material recommendation device; 110: Parameter information storage unit; 111: Material parameter information; 112: Emotion word parameter information; 113: Modal correction information; 120: Input unit; 121: Product type input unit; 122: Emotion word input unit; 123: Usage status input unit; 131: Material parameter extraction unit; 132: Emotion word parameter extraction unit; 133: Modal correction unit; 134: Recommended material determination unit; 140: Output unit
Abstract
This material recommendation device comprises: an input unit that receives the input of information indicating a product type and emotional language that indicates language representing an emotion; a material parameter extraction unit that extracts the value of a sensation parameter for each material corresponding to the product type on the basis of material parameter information in which the material used in a product belonging to the product type and the value of the sensation parameter, which indicates the degree of sensation a human receives with respect to the material, are associated with each other; an emotional language parameter extraction unit that extracts the value of the sensation parameter corresponding to the emotional language on the basis of emotional language parameter information in which the emotional language and the value of the sensation parameter are associated with each other; and a recommended material determination unit that determines a recommended material for the emotional language on the basis of the value of the sensation parameter for each material extracted by the material parameter extraction unit and the value of the sensation parameter extracted by the emotional language parameter extraction unit.
Description
The present invention relates to a material recommendation device, a material recommendation method, and a program.
This application claims priority based on Japanese Patent Application No. 2022-152385, filed in Japan on September 26, 2022, the contents of which are incorporated herein by reference.
Indicators that users use as evaluation criteria when selecting a product include, for example, the characteristics and performance of the materials used in the product. In addition, the impression the user receives, such as how the material is perceived through the five senses (sight, touch, hearing, taste, smell) and how the material appeals to the sensibilities, is also one of the important indicators that users use as evaluation criteria when selecting a product.
Previously, it was difficult to accurately present materials that produce the sensations desired by the user. In order to recommend materials that match the sensations desired by the user, for example, the person in charge of recommending the materials needs to thoroughly interview the user. Even in such cases, however, whether or not materials that produce the sensations desired by the user are appropriately recommended depends heavily on the sense of the person in charge. For example, even if the user is looking for a "comfortable" material, what the user considers comfortable does not necessarily match what the person in charge considers comfortable.
To address this issue, for example, the similar texture material recommendation system described in Patent Document 1 stores in advance parameter values that indicate the correspondence between terms expressing texture (onomatopoeia) and the physical properties of the surface of a material, and automatically presents recommended materials for an input onomatopoeia based on those parameter values.
The similar texture material recommendation system described in Patent Document 1 is a system that only accepts input of onomatopoeia expressing the sense of touch. In reality, however, the impression a user gets from a material is not necessarily an impression obtained through the sense of touch. The impression a user gets from a material may include impressions obtained through multiple, or all, of the five senses.
In addition, when users express in words the impression they get from a material, they do not necessarily use onomatopoeia that simply mimics the impression obtained through the five senses (such as "smooth" or "slippery"). Users often describe the impression they get from a material with words that express emotions. Words that express emotions here are adjectives and similar terms used to quantify human sensations in general sensory evaluations. Some of these adjectives, such as "favorite," "unique," "comfortable," and "luxurious," are not tied to any single one of the five senses. For these reasons, conventional technology has had difficulty recommending materials according to people's emotions.
The present invention was made in consideration of these circumstances, and aims to provide a material recommendation device, a material recommendation method, and a program that can recommend materials according to a person's emotions.
本開示の内容は、以下の実施態様を含む。
[1] 製品種別と、情動を表現する言葉を示す情動言葉と、を示す情報の入力を受け付ける入力部と、前記製品種別に属する製品に用いられる材料と前記材料に対して人間が受ける感覚の度合いを示す感覚パラメータの値とが対応付けられた材料パラメータ情報に基づいて、前記入力部に入力された前記製品種別に対応する前記材料ごとの感覚パラメータの値を抽出する材料パラメータ抽出部と、前記情動言葉と前記感覚パラメータの値とが対応付けられた情動言葉パラメータ情報に基づいて、前記入力部に入力された前記情動言葉に対応する感覚パラメータの値を抽出する情動言葉パラメータ抽出部と、前記材料パラメータ抽出部によって抽出された前記材料ごとの感覚パラメータの値と、前記情動言葉パラメータ抽出部によって抽出された前記感覚パラメータの値と、に基づいて、前記入力部に入力された前記情動言葉に対して推奨される前記材料を決定する推奨材料決定部と、を備える材料推奨装置。 The subject matter of the present disclosure includes the following embodiments.
[1] An ingredient recommendation device comprising: an input unit that accepts input of information indicating a product type and an emotion word indicating a word expressing an emotion; an ingredient parameter extraction unit that extracts a sensory parameter value for each ingredient corresponding to the product type input to the input unit based on ingredient parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation a human feels for the material; an emotion word parameter extraction unit that extracts a sensory parameter value corresponding to the emotion word input to the input unit based on emotion word parameter information in which the emotion word is associated with the sensory parameter values; and a recommended ingredient determination unit that determines the ingredient recommended for the emotion word input to the input unit based on the sensory parameter value for each ingredient extracted by the ingredient parameter extraction unit and the sensory parameter value extracted by the emotion word parameter extraction unit.
[2] The material recommendation device according to [1], wherein the emotion word is a word classified under the evaluation factor or the activity factor among the factors of the EPA (Evaluation, Potency, Activity) structure defined in impression evaluation by the SD (Semantic Differential) method.
[3] The material recommendation device according to [1] or [2], further comprising a correction unit that corrects the values of the sensory parameters using coefficients that take into account the cross-modal effect in which impressions received through multiple types of senses are related to each other.
[4] The material recommendation device described in [3], in which the coefficients are set to different values depending on the order in which impressions are received by the multiple types of senses, and the correction unit acquires information indicating the order and performs correction using the coefficients corresponding to the acquired order.
[5] The material recommendation device described in [4], in which the coefficient used to correct the value of the sensory parameter for each material is set to a value of 1 for the sensation received first, a value greater than 0 and less than 1 for the sensation received later, and a value of 0 for any other sensation.
[6] The material recommendation device according to [4] or [5], in which the coefficients used to correct the values of the sensory parameters of the emotional words are set to a value of 1 for the senses in which the cross-modal effect occurs, and are set to a value of 0 for the senses in which the cross-modal effect does not occur.
[7] The material recommendation device according to any one of [1] to [6], wherein the senses are the five senses or senses into which the five senses are further subdivided.
[8] A material recommendation method comprising: an input step of accepting input of information indicating a product type and an emotion word indicating a word expressing an emotion; a material parameter extraction step of extracting a sensory parameter value for each material corresponding to the product type input in the input step based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation a human feels for the material; an emotion word parameter extraction step of extracting a sensory parameter value corresponding to the emotion word input in the input step based on emotion word parameter information in which the emotion word is associated with the sensory parameter value; and a recommended material determination step of determining the material recommended for the emotion word input in the input step based on the sensory parameter value for each material extracted in the material parameter extraction step and the sensory parameter value extracted in the emotion word parameter extraction step.
[9] A program for causing a computer to execute the following steps: an input step of accepting input of information indicating a product type and an emotion word indicating a word expressing an emotion; a material parameter extraction step of extracting a sensory parameter value for each material corresponding to the product type input in the input step based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation a human feels for the material; an emotion word parameter extraction step of extracting a sensory parameter value corresponding to the emotion word input in the input step based on emotion word parameter information in which the emotion word is associated with the sensory parameter value; and a recommended material determination step of determining the material recommended for the emotion word input in the input step based on the sensory parameter value for each material extracted in the material parameter extraction step and the sensory parameter value extracted in the emotion word parameter extraction step.
The present invention makes it possible to recommend materials based on a person's emotions.
Below, one embodiment of the present disclosure will be described with reference to the drawings.
[Configuration of material recommendation device]
The following describes the configuration of the material recommendation device 100 according to the embodiment. FIG. 1 is a block diagram showing the functional configuration of the material recommendation device 100 according to an embodiment of the present disclosure.
The material recommendation device 100 is an information processing device configured using, for example, a general-purpose computer. The material recommendation device 100 accepts input of words expressing the user's desired feeling for the materials used in the product and expressing the user's emotions (hereinafter referred to as "emotion words"). The material recommendation device 100 determines at least one recommended material based on the input emotion words, and outputs information indicating the determined material.
FIG. 1 is a block diagram showing the functional configuration of a material recommendation device 100 in a first embodiment of the present disclosure. As shown in FIG. 1, the material recommendation device 100 includes a parameter information storage unit 110, an input unit 120, a material parameter extraction unit 131, an emotion word parameter extraction unit 132, a modal correction unit 133, a recommended material determination unit 134, and an output unit 140.
The parameter information storage unit 110 is configured using a computer-readable storage medium. The computer-readable storage medium referred to here is, for example, a magnetic hard disk device or a semiconductor storage device. The parameter information storage unit 110 pre-stores various types of parameter information used to determine recommended materials. As shown in FIG. 1, the parameter information storage unit 110 stores at least material parameter information 111, emotion word parameter information 112, and modal correction information 113. Details of each piece of parameter information will be explained later.
The input unit 120 is configured using an input device that accepts input operations by a user. The input device here is, for example, a keyboard, a mouse, a touch panel, and an input button. As shown in FIG. 1, the input unit 120 is configured to include a product type input unit 121, an emotion word input unit 122, and a usage status input unit 123.
The product type input unit 121 acquires information indicating the product type input by the user (hereinafter referred to as "product type information"). The product type input unit 121 outputs the acquired product type information to the material parameter extraction unit 131. The product type information here is information indicating the type of material used in the product, such as, for example, "food packaging material," "housing building material," and "synthetic leather material for bags."
The emotion word input unit 122 acquires information indicating emotion words input by the user (hereinafter simply referred to as "emotion words"). The emotion word input unit 122 outputs the acquired emotion words to the emotion word parameter extraction unit 132. The emotion words referred to here are words such as adjectives used to quantify human sensations in general sensory evaluations. Such adjectives include words that are not limited to a specific sensation among the five senses, such as "favorite," "unique," "comfortable," and "luxurious."
In general sensory evaluation, when human emotions are quantified and evaluated, the Semantic Differential (SD) method is often used. The SD method is a technique for measuring a person's impression (sensitivity) of an object using paired adjectives with opposite meanings, and evaluating it numerically. The SD method is characterized by its ability to express the structure of an impression with a small number of factors. For example, the SD method finds a structure consisting of three basic factors that can be interpreted as an evaluation factor, an activity factor, and a potency factor. This structure is called the EPA (Evaluation, Potency, Activity) structure.
Evaluation factors are expressed by pairs of adjectives that express a comprehensive evaluation of the value of an object, such as "like-dislike," "good-bad," and "beautiful-ugly." In addition, evaluation factors are also composed of pairs of adjectives that express the nature of an object, such as "stable-unstable," "clear-muddy," and "warm-cold," although this differs depending on the type of object being evaluated (shape, color, sound, etc.).
Both activity and potency factors are composed of expressions about the properties of an object. Activity factors are expressed by adjective pairs such as "noisy-quiet," "dynamic-static," and "flashy-subdued." Potency factors are expressed by adjective pairs such as "hard-soft," "sharp-dull," and "tense-relaxed."
FIG. 2 is a diagram showing an example of emotion words input to the material recommendation device 100 in one embodiment of the present disclosure. Of the three basic factors above, the evaluation factor and the activity factor contain many adjectives that directly express human emotions. In this embodiment, as an example, words classified under the evaluation factor or the activity factor of the EPA structure defined in impression evaluation by the SD method are defined as emotion words. Note that adjective pairs are not themselves defined as emotion words; instead, each adjective of a pair is defined separately as its own emotion word.
The usage status input unit 123 acquires information indicating the usage status of the product (hereinafter referred to as "usage status information") input by the user. The usage status input unit 123 outputs the acquired usage status information to the modal correction unit 133.
The usage status information referred to here is information from which it can be identified through which of the five senses the user receives an impression from a product. Furthermore, when the user receives impressions through two or more of the five senses, the usage status information makes it possible to identify the order in which the impressions are received through each sense. For example, when the user receives impressions through the two senses of sight and touch, the usage status information can identify whether the user first sees the product and then touches it, or first touches the product and then sees it.
The material parameter extraction unit 131, the emotion word parameter extraction unit 132, the modal correction unit 133, and the recommended material determination unit 134 are functional units that are realized by a processor such as a CPU (Central Processing Unit) reading and executing a program that is pre-stored in a storage unit (not shown), for example. Note that the above storage unit and the parameter information storage unit 110 may be configured using the same storage medium.
The material parameter extraction unit 131 acquires the product type information output from the product type input unit 121. The material parameter extraction unit 131 also refers to the material parameter information 111 stored in the parameter information storage unit 110. The material parameter extraction unit 131 extracts, from the material parameter information 111, the values of the material parameters associated with the product type that corresponds to the acquired product type information.
The material parameter information 111 is information that indicates the degree of association between a material and the five sense parameters. The five sense parameters referred to here are composed of items that indicate the five senses, or items that indicate sensations that are further subdivided from each of the five senses. In this embodiment, the five sense parameters are composed of items that indicate sensations that are further subdivided from each of the five senses.
FIG. 3 is a diagram showing an example of the five sense parameters used in the material recommendation device 100 in one embodiment of the present disclosure. As shown in FIG. 3, touch is further classified into the five sense parameters of "hardness," "roughness," "friction," and "warm/cold." Vision is further classified into the five sense parameters of "color 1," "color 2," "color 3," "texture 1," "texture 2," and "texture 3."
It is known that touch can be classified into four elements: hardness, roughness, friction, and warmth/coldness. More specifically, for example, the values of average roughness Ra and maximum height Rz defined in the standard (JIS B 0601) may be used as the five sense parameter for the tactile "roughness." Alternatively, parameters and values obtained by reducing the dimensionality of Ra, Rz, and the like through principal component analysis or factor analysis may be used. Likewise, for example, the value of Young's modulus calculated from the pressing force and displacement may be used as the five sense parameter for the tactile "hardness," and the value of the dynamic friction coefficient when touching a material may be used as the five sense parameter for the tactile "friction."
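The dimensionality reduction mentioned above, combining Ra and Rz into a single roughness parameter by principal component analysis, can be sketched as follows. This is a minimal illustration in plain Python: the measurement values are hypothetical, and the closed-form eigen decomposition of the 2x2 covariance matrix is just one possible implementation.

```python
import math
import statistics

# Hypothetical roughness measurements for four materials:
# (average roughness Ra, maximum height Rz) -- strongly correlated columns.
ra = [0.8, 1.2, 0.5, 1.6]
rz = [4.1, 6.0, 2.7, 7.9]

# Mean-center both columns.
ra_c = [v - statistics.fmean(ra) for v in ra]
rz_c = [v - statistics.fmean(rz) for v in rz]

# 2x2 covariance matrix entries (population covariance).
n = len(ra)
sxx = sum(a * a for a in ra_c) / n
syy = sum(b * b for b in rz_c) / n
sxy = sum(a * b for a, b in zip(ra_c, rz_c)) / n

# Leading eigenvalue/eigenvector of [[sxx, sxy], [sxy, syy]], closed form:
lam = (sxx + syy + math.hypot(sxx - syy, 2 * sxy)) / 2  # largest eigenvalue
vx, vy = sxy, lam - sxx                                  # its eigenvector (sxy != 0)
norm = math.hypot(vx, vy)
vx, vy = vx / norm, vy / norm

# Project onto the first principal component: one combined "roughness" value
# per material, replacing the two correlated measurements.
roughness_1d = [a * vx + b * vy for a, b in zip(ra_c, rz_c)]
explained = lam / (sxx + syy)  # share of total variance captured
```

Because Ra and Rz tend to rise and fall together, the first component typically captures most of the variance, which is what justifies dropping the second dimension.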
When the material recommendation device 100 is adapted to various materials, it is desirable to obtain uniformly evaluated parameter values for all materials with respect to touch, for example using a wearable device. By attaching a wearable device to a person's hand and actually touching the material, friction, hardness, etc. can be uniformly measured.
With regard to vision, the five sense parameters of vision shown in FIG. 3 include "Color 1", "Color 2", and "Color 3", but it is generally known that color can be expressed using three variables, such as L*a*b* or L*C*H. For example, these three variables may be used as the five sense parameters of "Color 1", "Color 2", and "Color 3", respectively. Note that the material recommendation device 100 may use the three variables in any color space, but it is necessary to evaluate all materials using the three variables in the same color space.
Unlike touch, vision is not known to be clearly classifiable into a finite set of elements. However, it has recently become clear that color and texture are important elements affecting a person's visual impression. Here, texture refers to physical property information related to reflectance, typified by the BRDF (Bidirectional Reflectance Distribution Function), which describes the angular distribution of reflected light when light is incident from a specific angle. In CG (Computer Graphics), the texture of a material is generally reconstructed by rendering with these BRDF parameters. Of course, for these parameters as well, parameters and values whose dimensionality has been reduced by principal component analysis or factor analysis may be used.
With regard to taste, the type and amount of taste chemical components are thought to be factors that affect people's impressions. With regard to smell, the type and concentration of olfactory chemical components are thought to be factors that affect people's impressions. With regard to hearing, the signal strength and power spectrum are thought to be factors that affect people's impressions. Of course, with regard to these parameters as well, parameters and values that have been reduced in dimension by principal component analysis or factor analysis may be used.
The material parameter values may be, for example, actual measured values obtained by some measurement method, or they may be values standardized across all materials. If actual measured values are used as-is, however, the units of each material parameter must be unified across all materials. For example, if the average roughness Ra value specified by the standard is used as the "roughness" material parameter, the "roughness" values of all materials should be expressed in the same unit, such as millimeters or micrometers.
In addition, among the five senses, sight and touch are thought to carry particularly large weight in the impression received from materials used in industrial products such as food packaging materials, building materials for homes, and synthetic leather for bags.
FIG. 4 is a diagram showing an example of material parameter information 111 stored in the parameter information storage unit 110 of the material recommendation device 100 in one embodiment of the present disclosure. The material parameter information 111 is data indicating the association between a material and the five sensory parameters described above. Hereinafter, the five sensory parameters associated with a material will be referred to as "material parameters."
In the material parameter information 111 shown in FIG. 4, the product type, product number, information indicating the material composition of the product, and material parameters are associated with each other. Each row of the material parameter information 111 is information about one material.
For example, in the first line of the material parameter information 111 shown in FIG. 4, the value "food packaging material" is registered in the "product type" field, and the value "101a001b001" is set in the "product number" field. This indicates that the material to which the product number "101a001b001" is assigned is a material used for food packaging material. Also, in the first line of the material parameter information 111 shown in FIG. 4, the value "101" is set in the "main component" field of the "product material composition" field, the value "a001" is set in the "additive 1" field, and the value "b001" is set in the "additive 2" field. This indicates that the material to which the product number "101a001b001" is assigned is a material that includes a main component assigned with the identification information "101", an additive assigned with the identification information "a001", and an additive assigned with the identification information "b001".
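A minimal sketch of how the material parameter extraction unit 131 might filter a table like the one in FIG. 4 by product type. The in-memory representation, the Python field names, and the second row are assumptions for illustration; only product number "101a001b001" and its "hardness"/"warm_cold" values come from the examples in the text.

```python
# Hypothetical in-memory form of the material parameter information 111.
MATERIAL_PARAMETER_INFO = [
    {"product_type": "food packaging material",
     "product_number": "101a001b001",
     "composition": {"main": "101", "additive1": "a001", "additive2": "b001"},
     "params": {"hardness": 0.12, "warm_cold": 0.03}},
    # Invented second row for illustration only.
    {"product_type": "synthetic leather material for bags",
     "product_number": "201c001",
     "composition": {"main": "201", "additive1": "c001", "additive2": None},
     "params": {"hardness": -0.40, "warm_cold": 0.25}},
]

def extract_material_params(product_type):
    """Return {product number: sensory parameter values} for every material
    belonging to the given product type."""
    return {row["product_number"]: row["params"]
            for row in MATERIAL_PARAMETER_INFO
            if row["product_type"] == product_type}

food_materials = extract_material_params("food packaging material")
```

Each recommended-material candidate is thus reduced to a vector of standardized sensory parameter values, keyed by its product number.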
Furthermore, for each material parameter in the material parameter information 111, a standardized value centered on 0 is registered. Standardization here means subtracting, from each material parameter value, the mean of that parameter over the entire database and dividing the result by its standard deviation, yielding values with a mean of 0 and a variance of 1. In this way, each material is quantified by the five sense parameters (material parameters). In the example of FIG. 4, the value of the five sense parameter "hardness" for product number "101a001b001" is "0.12", and the value of the five sense parameter "warm/cold" is "0.03".
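The standardization described above is an ordinary z-score transformation. A minimal sketch in plain Python, using hypothetical raw "hardness" measurements:

```python
import statistics

def standardize(values):
    """Z-score standardization: subtract the mean of the whole column and
    divide by its (population) standard deviation, so the resulting column
    has mean 0 and variance 1."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)  # population SD, matching variance = 1
    return [(v - mean) / sd for v in values]

raw_hardness = [3.0, 5.0, 4.0, 8.0]  # hypothetical raw measurements
z = standardize(raw_hardness)
```

Applying the same transformation to every parameter column makes values such as "hardness" and "warm/cold" comparable regardless of their original measurement units.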
Depending on the product type, some of the five senses (for example, hearing or taste) have no effect on the impression of the material. For example, taste may slightly affect the impression of food packaging materials, but has no effect on the impression of building materials for homes or synthetic leather materials for bags.
The emotion word parameter extraction unit 132 acquires the emotion words output from the emotion word input unit 122. As illustrated in FIG. 2, emotion words are words that express human emotions. The emotion word parameter extraction unit 132 also refers to the emotion word parameter information 112 stored in the parameter information storage unit 110, and extracts from it the emotion word parameter values associated with the acquired emotion words.
Emotion word parameter information 112 is information that indicates the degree of association between emotion words and five sense parameters. As mentioned above, the five sense parameters are composed of items that indicate sensations that are further subdivided into each of the five senses, for example, as shown in FIG. 3.
FIG. 5 is a diagram showing an example of emotion word parameter information 112 stored in the parameter information storage unit 110 of the material recommendation device 100 in one embodiment of the present disclosure. The emotion word parameter information 112 is data indicating the association between emotion words and the five sense parameters described above. Hereinafter, the five sense parameters associated with emotion words are referred to as "emotion word parameters."
In the emotion word parameter information 112 shown in FIG. 5, emotion words and emotion word parameters are associated with each other. Each row of the emotion word parameter information 112 is information about one emotion word. In the example of FIG. 5, the value of the five sense parameter "texture 1" for the emotion word "beautiful" is "0.73", and the value of the five sense parameter "texture 2" is "-0.15".
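A minimal sketch of how the emotion word parameter extraction unit 132 might look up a row of a table like the one in FIG. 5. The dictionary layout, the "comfortable" row, and the "hardness" column are assumptions for illustration; only the two "beautiful" texture values come from the example in the text.

```python
# Hypothetical in-memory form of the emotion word parameter information 112.
EMOTION_WORD_PARAMETER_INFO = {
    "beautiful":   {"texture1": 0.73, "texture2": -0.15, "hardness": 0.10},
    "comfortable": {"texture1": 0.21, "texture2": 0.05,  "hardness": -0.62},
}

def extract_emotion_word_params(emotion_word):
    """Return the sensory parameter values associated with one emotion word."""
    return EMOTION_WORD_PARAMETER_INFO[emotion_word]

beautiful_params = extract_emotion_word_params("beautiful")
```

The extracted row is a vector in the same five sense parameter space as the material parameters, which is what later allows the two to be compared.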
In this way, each emotion word is quantified in advance using the five sense parameters (emotion word parameters). One possible method of quantification is to have multiple subjects perform sensory evaluations of each material using emotion words in advance, and then create a regression model between the emotion words and the five sense parameters to determine the values. As with the material parameters, these emotion word parameters may be standardized values computed over the entire emotion word database.
For each emotion word parameter in the emotion word parameter information 112, a standardized value centered on 0 is registered, similar to the material parameter values shown in FIG. 4. Standardization here means subtracting, from each emotion word parameter value, the mean of that parameter over the entire database and dividing the result by its standard deviation, yielding values with a mean of 0 and a variance of 1. In this way, each emotion word is quantified using the five sense parameters (emotion word parameters).
The modal correction unit 133 acquires the usage status information output from the usage status input unit 123. As described above, the usage status information specifies through which of the five senses, and in what manner, the user receives an impression. The modal correction unit 133 also refers to the modal correction information 113 stored in the parameter information storage unit 110. Based on the acquired usage status information and the modal correction information 113, the modal correction unit 133 corrects the values of the material parameters extracted by the material parameter extraction unit 131 and the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132.
Specifically, for example, when a user receives impressions through multiple of the five senses, the modal correction unit 133 identifies, based on the usage situation information, the order in which the impressions are received through each sense. The modal correction unit 133 extracts, from the modal correction information 113, a set of coefficients for taking into account the cross-modal effect occurring in the identified usage situation. The modal correction unit 133 multiplies the extracted set of coefficients by the values of the material parameters extracted by the material parameter extraction unit 131 and the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132, respectively.
Generally, a cross-modal effect refers to a relationship between impressions received through two of the five senses, such as vision and touch. Generally, a multi-modal effect refers to a relationship between impressions received through multiple senses, such as vision, touch, and hearing. However, in this embodiment, the cross-modal effect also includes the meaning of the multi-modal effect.
For example, when a user recognizes a bag, they may look at it visually and then touch it to check its texture, or conversely, they may touch it to check its texture and then look at it visually. The overall impression that the user gets from a material will differ depending on whether the impression is received first by the sense of sight or touch. For example, materials such as synthetic leather for bags give the user different impressions depending on the order in which multiple senses are activated, creating a cross-modal effect.
The material recommendation device 100 in this embodiment takes into account the cross-modal effect according to the usage situation based on the acquired usage situation information. Specifically, as described above, the modal correction unit 133 multiplies the values of the material parameters extracted by the material parameter extraction unit 131 and the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132 by a set of coefficients for taking the cross-modal effect into account. As a result, the values of the material parameters and the emotion word parameters are corrected to values that reflect the influence of the cross-modal effect according to the usage situation information.
In this embodiment, modal correction information 113 including a group of coefficients to be multiplied by the material parameter values (for example, modal correction information 113 shown in FIG. 6) and modal correction information 113 including a group of coefficients to be multiplied by the emotional word parameter values (for example, modal correction information 113 shown in FIG. 7) are pre-stored separately in the parameter information storage unit 110.
FIGS. 6 and 7 are diagrams showing an example of modal correction information 113 stored in the parameter information storage unit 110 of the material recommendation device 100 in one embodiment of the present disclosure.
FIG. 6 shows an example of the modal correction information 113 including the group of coefficients by which the material parameter values are multiplied; that is, a group of coefficients, corresponding to each piece of usage status information, used to correct the material parameter values. As shown in FIG. 6, for the coefficients by which the material parameter values are multiplied, the coefficient value of the sensation received first in the cross-modal sequence is preset to 1, the coefficient value of the sensation received later is preset to a value greater than 0 and less than 1, and the coefficient values of the other sensations are preset to 0.
As a method for quantifying such coefficients, similar to the above-mentioned case where emotional words are quantified in advance using the five sense parameters, a method can be considered in which a sensory evaluation is carried out in advance on multiple subjects using each material for each usage situation, and a regression model between the usage situation and the material is created to determine the numerical value.
FIG. 7 shows an example of modal correction information 113 including a group of coefficients by which the value of the emotion word parameter is multiplied. FIG. 7 shows a group of coefficients corresponding to each piece of usage information, which are used to correct the value of the emotion word parameter. As shown in FIG. 7, in the case of coefficients by which the value of the emotion word parameter is multiplied, for example, the coefficient value of the sensation that produces a cross-modal effect is preset to 1, and the coefficient values of the other sensations are preset to 0.
The material parameter extraction unit 131 outputs information indicating the material parameters corrected by the modal correction unit 133 to the recommended material determination unit 134. In addition, the emotion word parameter extraction unit 132 outputs information indicating the emotion word parameters corrected by the modal correction unit 133 to the recommended material determination unit 134.
The recommended material determination unit 134 acquires information indicating the material parameters output from the material parameter extraction unit 131 and information indicating the emotion word parameters output from the emotion word parameter extraction unit 132. Based on the acquired information, the recommended material determination unit 134 calculates the distance between the coordinates of the material parameter value and the emotion word parameter value for each material. The recommended material determination unit 134 determines the material to be recommended based on the calculated distance. For example, the recommended material determination unit 134 determines the material with the shortest calculated distance, or a predetermined number of materials in order of shortest calculated distance, as the recommended materials.
The recommended material determination unit 134 outputs information indicating the recommended material to the output unit 140. Note that the recommended material determination unit 134 may also output information indicating the distance between the material parameter value and the emotional word parameter value for each recommended material.
Here, materials matching the emotion word are selected by simply comparing the distances between the material parameter values and the emotion word parameter values. Alternatively, a method of solving an inverse problem, such as Bayesian optimization, may be used, or a method of solving a forward problem that minimizes the error via a grid search may be used. It is also possible to configure the system so that AutoML (Automated Machine Learning) tools such as DataRobot (registered trademark) perform the above tasks.
The output unit 140 is configured using an output device that presents information to the user. The output device here is, for example, a display device such as a liquid crystal display (LCD), a cathode ray tube (CRT), or an organic electro-luminescence (EL) display. The output unit 140 may also be a communication interface that transmits information to an external device that presents the information to the user. The external device is, for example, an information processing device such as a general-purpose computer, various displays, or a printer.
The output unit 140 acquires the information indicating the recommended materials output from the recommended material determination unit 134. The output unit 140 presents the information indicating the recommended materials to the user.
[Operation of the material recommendation device]
An example of the operation of the material recommendation device 100 according to the present disclosure will be described below with reference to a specific example. FIG. 8 is a flowchart showing the operation of the material recommendation device 100 according to an embodiment of the present disclosure.
First, the product type input unit 121 accepts input of product type information by the user. For example, a list of values for the "product type" item, which is the leftmost column of the material parameter information 111 shown in FIG. 4, ("food packaging material," "residential building material," "synthetic leather for bags," ...) is displayed on a display device such as a display. The user performs an input operation to select the desired product type from the displayed list using an input device such as a mouse. Note that, as an example, it is assumed here that the user has selected the product type of "synthetic leather for bags."
The product type input unit 121 outputs product type information indicating "synthetic leather material for bags" to the material parameter extraction unit 131. The material parameter extraction unit 131 acquires the product type information output from the product type input unit 121 (step S01).
Next, the emotion word input unit 122 accepts the input of an emotion word by the user. For example, the list of emotion words shown in FIG. 2 ("favorite," "flashy," "adorable," ...) is displayed on a display device. The user performs an input operation to select a desired emotion word from the displayed list using an input device such as a mouse. As an example, it is assumed here that the user has selected the emotion word "beautiful." The input is not limited to this example; multiple emotion words may be selected, in which case the following process is executed for each emotion word.
The emotion word input unit 122 outputs the emotion word "beautiful" to the emotion word parameter extraction unit 132. The emotion word parameter extraction unit 132 acquires the emotion word output from the emotion word input unit 122 (step S02).
Next, the usage status input unit 123 accepts the input of usage status information by the user. For example, a list of the values of the "first" and "last" items under "cross-modal effect" (the first two columns from the left of the modal correction information 113 shown in FIG. 6 or FIG. 7), such as "tactile"/"none", "visual"/"none", "auditory"/"none", and so on, is displayed on a display device. The user imagines the situation in which the product will be used and performs an input operation to select the desired usage status information from the displayed list using an input device such as a mouse. As an example, it is assumed here that the user selects usage status information in which the sensation received first is "visual" and the sensation received later is "tactile".
The usage status input unit 123 outputs usage status information indicating that the first sensation is "visual" and the second sensation is "tactile" to the modal correction unit 133. The modal correction unit 133 acquires the usage status information output from the usage status input unit 123 (step S03).
Note that a list of options consisting of phrases such as "Decide whether to purchase just by looking at the bag" or "Decide whether to purchase after looking at the bag and checking its feel" may be displayed on a display device, and the user may select from that list. In this case, the parameter information storage unit 110 stores in advance information that associates each option with one or more senses. That is, for example, information is stored in advance in which "Decide whether to purchase just by looking at the bag" is associated with "sight," and "Decide whether to purchase after looking at the bag and checking its feel" is associated with "sight" and "touch."
Then, when the modal correction unit 133 obtains from the usage status input unit 123 information indicating that the user has selected the option "Decide whether to purchase after looking at the bag and checking its feel," it can refer to the above information stored in advance in the parameter information storage unit 110 and determine that usage status information in which the sensation received first is "sight" and the sensation received later is "touch" has been selected.
Then, the material parameter extraction unit 131 extracts the values of the material parameters associated with the product type corresponding to the acquired product type information from the material parameter information 111 pre-stored in the parameter information storage unit 110 (step S04). For example, since the product type "synthetic leather for bags" has been selected by the user here, the material parameter extraction unit 131 extracts all the data (values of the material parameters) in the rows in which the value of the "product type" item in the material parameter information 111 shown in FIG. 4 is "synthetic leather for bags".
In the material parameter information 111 shown in FIG. 4, there are five rows in which the value of the "Product Type" item is "Synthetic Leather for Bags" (m=5). That is, a total of five materials used as synthetic leather for bags are registered (product numbers "301a201b101" to "305a201b105"). If the number of material parameters (sensory parameters) associated with each material is n, then the material parameter extraction unit 131 will extract m (5) matrices consisting of 1 row and n columns (hereinafter referred to as "Matrix A").
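The row extraction in step S04 can be sketched as a simple filter over the material parameter table. The table contents, field names, and function below are hypothetical illustrations, not the actual data structure of the material parameter information 111:

```python
# Illustrative stand-in for (part of) the material parameter information 111.
MATERIAL_PARAMETER_INFO = [
    {"product_type": "synthetic leather for bags", "product_no": "301a201b101",
     "params": [-0.09, 0.71, 0.69, 0.26]},
    {"product_type": "synthetic leather for bags", "product_no": "302a201b102",
     "params": [0.40, -0.22, 0.15, -0.31]},
    {"product_type": "food packaging material", "product_no": "101a001b001",
     "params": [0.55, 0.08, -0.47, 0.12]},
]

def extract_material_rows(table, product_type):
    """Return the 1-by-n parameter rows (the matrices A) for every material
    whose product type matches the user's selection."""
    return [row["params"] for row in table
            if row["product_type"] == product_type]

rows = extract_material_rows(MATERIAL_PARAMETER_INFO,
                             "synthetic leather for bags")
# Two of the three illustrative rows match the selected product type.
```

With the five registered synthetic leather materials of FIG. 4, this filter would instead yield m = 5 rows of n parameters each.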
Next, the modal correction unit 133 corrects the values of the material parameters extracted by the material parameter extraction unit 131 based on the acquired usage information and the modal correction information 113 (step S05).
For example, here, the usage status information selected by the user is one in which the first sensation is "visual" and the second sensation is "tactile". Therefore, the modal correction unit 133 extracts data (a group of coefficients) from a row in which the value of the "first" item in the "cross-modal effect" item of the modal correction information 113 shown in FIG. 6 is "visual" and the value of the "last" item is "tactile". If the number of material parameters associated with each material is n, the modal correction unit 133 extracts one matrix consisting of 1 row and n columns. The modal correction unit 133 then generates a diagonal matrix consisting of n rows and n columns (hereinafter referred to as "diagonal matrix B") with the extracted matrix as its diagonal components.
The modal correction unit 133 corrects the values of the material parameters by multiplying each of the extracted matrices A by the generated diagonal matrix B. This causes the values of the material parameters to be corrected to values that take into account the cross-modal effects according to the selected usage information.
FIG. 9 is a diagram showing an example of a material parameter value correction process performed by the modal correction unit 133 of the material recommendation device 100 in one embodiment of the present disclosure. As an example, FIG. 9 shows a matrix calculation when correcting the material parameter values of a synthetic leather material for a bag that is assigned the product number "301a201b101". As shown in FIG. 9, a matrix consisting of corrected material parameter values is calculated by multiplying matrix A consisting of material parameter values extracted from the material parameter information 111 shown in FIG. 4 by diagonal matrix B generated from a group of coefficients extracted from the modal correction information 113 shown in FIG. 6 ("-0.09", "0.71", "0.69", "0.26", ...).
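The matrix calculation above can be sketched compactly: multiplying the 1-by-n row matrix A by the n-by-n diagonal matrix B is equivalent to an elementwise product of A with the coefficient row, so a plain loop suffices. The numeric values below are illustrative, not taken from the actual modal correction information 113:

```python
def apply_modal_correction(param_row, coeff_row):
    """Elementwise product, equivalent to param_row @ diag(coeff_row)."""
    if len(param_row) != len(coeff_row):
        raise ValueError("parameter row and coefficient row lengths differ")
    return [p * c for p, c in zip(param_row, coeff_row)]

# Sensation received first -> 1.0, received later -> 0 < c < 1, others -> 0.
raw_params = [-0.09, 0.71, 0.69, 0.26]      # one row (matrix A), illustrative
coefficients = [1.0, 1.0, 0.5, 0.0]          # diagonal of matrix B, illustrative
corrected = apply_modal_correction(raw_params, coefficients)
# corrected is [-0.09, 0.71, 0.345, 0.0]: the later sensation's parameters
# are attenuated and unrelated sensations are zeroed out.
```

The same function serves for step S07, where the emotion word parameter row (matrix C) is multiplied by the diagonal matrix D.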
Next, the emotion word parameter extraction unit 132 extracts the value of the emotion word parameter associated with the acquired emotion word from the emotion word parameter information 112 pre-stored in the parameter information storage unit 110 (step S06). For example, since the emotion word "beautiful" has been selected by the user here, the emotion word parameter extraction unit 132 extracts data (emotion word parameter value) from the row in which the value of the "emotion word" item in the emotion word parameter information 112 shown in FIG. 5 is "beautiful."
If the number of emotion word parameters (sensory parameters) associated with each emotion word is n, then the emotion word parameter extraction unit 132 extracts m matrices (hereinafter referred to as "matrix C") consisting of one row and n columns. The value of m here represents the number of emotion words selected by the user. Here, as an example, since only one emotion word ("beautiful") has been selected by the user, the emotion word parameter extraction unit 132 extracts one matrix C.
Next, the modal correction unit 133 corrects the values of the emotion word parameters extracted by the emotion word parameter extraction unit 132 based on the acquired usage information and the modal correction information 113 (step S07).
For example, here, the usage information selected by the user is one in which the first sensation is "visual" and the second sensation is "tactile". For this reason, the modal correction unit 133 extracts data (a group of coefficients) from rows in which the value of the "first" item in the "cross-modal effect" item of the modal correction information 113 shown in FIG. 7 is "visual" and the value of the "last" item is "tactile". If the number of emotion word parameters associated with each emotion word is n, the modal correction unit 133 extracts one matrix consisting of 1 row and n columns. The modal correction unit 133 then generates a diagonal matrix consisting of n rows and n columns (hereinafter referred to as "diagonal matrix D") with the extracted matrix as its diagonal components.
The modal correction unit 133 corrects the values of the emotion word parameters by multiplying each of the extracted matrices C (here, one) by the generated diagonal matrix D. In this way, the emotion word parameter values are corrected to values that take into account the cross-modal effect.
FIG. 10 is a diagram showing an example of the correction process of the emotion word parameter values performed by the modal correction unit 133 of the material recommendation device 100 in one embodiment of the present disclosure. FIG. 10 shows the matrix calculation used to correct the emotion word parameter values corresponding to the emotion word "beautiful". As shown in FIG. 10, the matrix C consisting of the emotion word parameter values extracted from the emotion word parameter information 112 shown in FIG. 5 is multiplied by the diagonal matrix D generated from the group of coefficients extracted from the modal correction information 113 shown in FIG. 7, thereby calculating a matrix consisting of the corrected emotion word parameter values ("0.03", "-0.81", "-0.62", "0.03", ...).
The material parameter extraction unit 131 outputs the corrected material parameter values to the recommended material determination unit 134. In addition, the emotion word parameter extraction unit 132 outputs the corrected emotion word parameter values to the recommended material determination unit 134.
Next, the recommended material determination unit 134 acquires the corrected material parameter values output from the material parameter extraction unit 131 and the corrected emotion word parameter values output from the emotion word parameter extraction unit 132. The recommended material determination unit 134 treats the acquired material parameter matrix and emotion word parameter matrix as coordinates, and calculates the distance between those coordinates for each material. The recommended material determination unit 134 determines the materials to be recommended based on the calculated distances (step S08). For example, among multiple materials, a material whose material parameter coordinates lie closer to the coordinates of the emotion word parameter corresponding to the emotion word specified by the user is regarded as more strongly related to that emotion word, so the materials can be identified in order of increasing distance.
FIG. 11 is a diagram showing an example of a calculation process of the distance between coordinates performed by the recommended material determination unit 134 of the material recommendation device 100 in one embodiment of the present disclosure. FIG. 11 shows a process of subtracting the components of a matrix consisting of the corrected material parameter values calculated by the correction process shown in FIG. 9 from the matrix consisting of the corrected emotion word parameter values calculated by the correction process shown in FIG. 10, squaring each value of the calculated matrix components, and further adding up the calculated values to perform a root calculation to calculate the distance between coordinates.
For example, in FIG. 11, the value of the emotion word parameter for "hardness" is "0.03", and the value of the material parameter for "hardness" is "-0.09", so subtracting and squaring it results in "0.0144". The recommended material determination unit 134 adds up the component values calculated for each of the five sense parameters in this way, and further performs a root calculation to calculate the value, which is the distance between the coordinates of the emotion word parameter and the material parameter. Note that here, the calculated distance between the coordinates is, for example, the square root of "4.0938 + α". However, the calculation of the distance between the coordinates is not limited to this example, and may be calculated by other methods known in the technical field.
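The distance computation just described is the ordinary Euclidean distance between the two parameter vectors. A minimal sketch, reusing the "hardness" component from the text (0.03 versus -0.09); the remaining component values are illustrative:

```python
import math

def euclidean_distance(emotion_params, material_params):
    """Root of the summed squared differences, one term per sensory parameter."""
    return math.sqrt(sum((e - m) ** 2
                         for e, m in zip(emotion_params, material_params)))

# First component reproduces the text: (0.03 - (-0.09))**2 == 0.0144.
hardness_term = (0.03 - (-0.09)) ** 2

# Illustrative two-parameter vectors (corrected emotion word vs. material).
d = euclidean_distance([0.03, -0.81], [-0.09, 0.71])
```

The full computation in FIG. 11 runs over all n five-sense parameters instead of two, but the per-term structure is identical.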
The recommended material determination unit 134 performs the above calculation for each material and each emotion word, and calculates the distance between the coordinates. The recommended material determination unit 134 determines the material to be recommended based on the calculated distance between the coordinates. The recommended material determination unit 134 outputs information indicating the recommended material and information indicating the calculated distance between the coordinates for the material to the output unit 140.
The output unit 140 acquires information indicating the recommended material output from the recommended material determination unit and information indicating the distance between the coordinates calculated for the material. The output unit 140 presents the information indicating the recommended material and information indicating the distance between the coordinates calculated for the material to the user (step S09).
図12は、本開示の一実施形態における材料推奨装置100の出力部140によって出力される情報の一例を示す図である。図12に示されるように、出力部140によって出力される情報には、例えば、選択された製品種別(「鞄用合皮材」)と、選択された情動言葉(「美しい」)と、材料ごとの(製品番号ごとの)座標間の距離の値とが含まれる。このとき、出力部140は、座標間の距離がより短い材料から順に出力することで、より推奨される材料が上位に位置するように表示させる。なお、図12に示されるように、座標間の距離が棒グラフでも表示されることにより、利用者は、より直感的に各材料の推奨度を認識することができる。ただし、この例に限定されるものではなく、製品番号の所定の順番に沿って座標間の距離が表示されてもよい。
FIG. 12 is a diagram showing an example of information output by the output unit 140 of the material recommendation device 100 in one embodiment of the present disclosure. As shown in FIG. 12, the information output by the output unit 140 includes, for example, the selected product type ("synthetic leather for bags"), the selected emotion word ("beautiful"), and the value of the distance between the coordinates for each material (for each product number). The output unit 140 outputs the materials in ascending order of the distance between the coordinates, so that more strongly recommended materials are displayed nearer the top. Note that, as shown in FIG. 12, the distance between the coordinates is also displayed as a bar graph, allowing the user to recognize the recommendation level of each material more intuitively. However, the display is not limited to this example, and the distances between the coordinates may instead be displayed in a predetermined order of the product numbers.
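The ordering just described (shortest coordinate distance first, i.e. most recommended at the top) can be sketched as follows; the product numbers and distance values are illustrative, not taken from FIG. 12.

```python
def rank_materials(distances):
    """Sort (product_number, distance) pairs so that the shortest
    coordinate distance -- the most recommended material -- comes first."""
    return sorted(distances.items(), key=lambda item: item[1])

# Hypothetical product numbers and distances.
ranked = rank_materials({"303a201b102": 8.7,
                         "303a201b104": 9.1,
                         "303a201b103": 10.2})
# ranked[0] is the most recommended material.
```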
The following is an example of a method for presenting recommended materials by the output unit 140 when the user has selected multiple emotion words. Figure 13 is a diagram showing another example of information output by the output unit 140 of the material recommendation device 100 in one embodiment of the present disclosure. Here, it is assumed that the user has selected the emotion words "flashy" and "manly."
In the example of FIG. 13, the information output by the output unit 140 includes, for example, the selected product type ("synthetic leather for bags"). The information output by the output unit 140 also includes, for example, the value of the distance between the coordinates for each material (for each product number) for each selected emotion word ("flashy" and "manly"). The information output by the output unit 140 also includes, for example, the average value of the distance between the coordinates for multiple emotion words for each material.
By the output unit 140 presenting information in this manner, the user can recognize that the synthetic leather for bags closest to the impression of "flashy" is the material assigned the product number "303a201b102", with a coordinate distance of "8.7". Likewise, the user can recognize that the synthetic leather for bags closest to the impression of "manly" is the material assigned the product number "303a201b104", with a coordinate distance of "8.3". The user can also recognize that the synthetic leather for bags that on average combines the "flashy" impression and the "manly" impression most strongly is the material assigned the product number "303a201b104", with an average coordinate distance of "9.4". When three or more emotion words are specified, the average value over all of the emotion words may be output, or an average value may be output for each combination of two of the emotion words.
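The averaging over multiple emotion words, including the per-pair averages mentioned for three or more words, might look like the sketch below; the dictionary shapes, function names, and all values are assumptions for illustration.

```python
from itertools import combinations
from statistics import mean

def average_distances(per_word):
    """per_word maps emotion word -> {product_number: distance}.
    Returns, for each product, the average distance over all words."""
    products = next(iter(per_word.values())).keys()
    return {p: mean(d[p] for d in per_word.values()) for p in products}

def pairwise_averages(per_word):
    """For three or more emotion words, one average per pair of words."""
    return {pair: average_distances({w: per_word[w] for w in pair})
            for pair in combinations(per_word, 2)}

# Hypothetical distances for two emotion words.
avg = average_distances({"flashy": {"303a201b104": 10.5},
                         "manly":  {"303a201b104": 8.3}})
```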
This allows the user to, for example, select the material with the shortest distance between all coordinates, or to select the material that has the closest impression on average to the multiple emotion words selected.
For example, the output unit 140 may generate and present a map that allows the coordinates of the material parameters and the coordinates of the emotion word parameters to be grasped visually. Also, for example, the output unit 140 may analyze the coordinates of the material parameters and the emotion word parameters using principal component analysis or multidimensional scaling, and present the analysis results visually. In the case of principal component analysis, for example, the output unit 140 may present the positional relationship between the coordinates of the material parameters and the emotion word parameters in a two-dimensional biplot whose two axes are the first and second principal components.
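A minimal sketch of the principal component projection mentioned here, computed via an SVD of the centered data; the coordinate values are made up, and a full biplot would additionally draw the loading (variable) vectors alongside these scores.

```python
import numpy as np

def first_two_components(coords):
    """Project material and emotion-word coordinates onto the first two
    principal components for a 2-D (biplot-style) display."""
    X = np.asarray(coords, dtype=float)
    X = X - X.mean(axis=0)                      # center each sensory axis
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:2].T                         # (n_points, 2) PC scores

# Hypothetical 3-axis sensory coordinates for three points.
points = first_two_components([[0.03, 0.5, 0.1],
                               [-0.09, 0.2, 0.4],
                               [0.30, 0.1, 0.2]])
```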
As described above, the material recommendation device 100 in one embodiment of the present disclosure includes an input unit 120, a material parameter extraction unit 131, an emotion word parameter extraction unit 132, and a recommended material determination unit 134. The input unit 120 accepts input of information indicating a product type and an emotion word, that is, a word expressing an emotion. The material parameter extraction unit 131 extracts sensory parameter (material parameter) values for each material corresponding to the product type input to the input unit 120, based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation humans experience from each material. The emotion word parameter extraction unit 132 extracts sensory parameter (emotion word parameter) values corresponding to the emotion word input to the input unit 120, based on emotion word parameter information in which emotion words are associated with sensory parameter values. The recommended material determination unit 134 determines the materials recommended for the emotion word input to the input unit 120, based on the sensory parameter values for each material extracted by the material parameter extraction unit 131 and the sensory parameter values extracted by the emotion word parameter extraction unit 132.
With this configuration, the material recommendation device 100 can automatically recommend appropriate materials according to a person's emotions, without relying on the sense of a specific person who recommends the materials.
Furthermore, the material recommendation device 100 corrects the sensory parameter values for each material and the sensory parameter values corresponding to emotion words to take into account the influence of cross-modal effects. With this configuration, the material recommendation device 100 can more accurately recommend appropriate materials according to a person's emotions.
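One possible reading of this correction, sketched below with hypothetical coefficient values; the actual coefficients would come from the modal correction information 113, with values such as 1 for the sense experienced first, a value in (0, 1) for a sense experienced later, and 0 otherwise.

```python
def correct_params(params, coeffs):
    """Scale each sense's parameter value by a cross-modal coefficient.
    Senses with no coefficient are suppressed (treated as 0)."""
    return {sense: value * coeffs.get(sense, 0.0)
            for sense, value in params.items()}

# Hypothetical: vision experienced first (1.0), touch later (0.6),
# smell not involved in the cross-modal effect (0).
corrected = correct_params({"vision": 0.8, "touch": 0.5, "smell": 0.3},
                           {"vision": 1.0, "touch": 0.6})
# corrected: vision 0.8, touch 0.3 (= 0.5 * 0.6), smell 0.0
```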
(Modification)
Of the functional configuration of the material recommendation device 100 shown in FIG. 1, the modal correction information 113, the usage status input unit 123, and the modal correction unit 133 may be omitted to form a simpler configuration. In other words, it is also possible to provide a material recommendation device in which, of the functions of the material recommendation device 100, the configuration for correcting the five sense parameters in consideration of the cross-modal effect is omitted. That is, the material recommendation device 100 may determine the recommended materials without correcting the sensory parameter values for each material and the sensory parameter values of the emotion word according to the usage status information. Such a simpler configuration makes it possible to reduce the device cost and operating cost of the material recommendation device 100, as well as the man-hours required to prepare the modal correction information 113.
The above-mentioned processing may be performed by recording a program for implementing the functions of the material recommendation device 100 in the embodiment on a computer-readable recording medium, and having a computer system read and execute the program recorded on the recording medium. Note that the term "computer system" as used herein includes an OS and hardware such as peripheral devices. The term "computer system" also includes a WWW system equipped with a homepage providing environment (or display environment). The term "computer-readable recording medium" refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and to storage devices such as hard disks built into a computer system. The term "computer-readable recording medium" also includes media that hold a program for a certain period of time, such as the volatile memory (RAM) inside a computer system that acts as a server or client when a program is transmitted via a network such as the Internet or a communication line such as a telephone line.
The above program may also be transmitted from a computer system in which the program is stored in a storage device or the like to another computer system via a transmission medium, or by transmission waves in a transmission medium. Here, the "transmission medium" that transmits the program refers to a medium having the function of transmitting information, such as a network (communication network) like the Internet or a communication line like a telephone line. The above program may also be one that realizes only part of the above-mentioned functions. Furthermore, it may be a so-called difference file (difference program) that realizes the above-mentioned functions in combination with a program already recorded in the computer system.
Reference Signs List
100: Material recommendation device
110: Parameter information storage unit
111: Material parameter information
112: Emotion word parameter information
113: Modal correction information
120: Input unit
121: Product type input unit
122: Emotion word input unit
123: Usage status input unit
131: Material parameter extraction unit
132: Emotion word parameter extraction unit
133: Modal correction unit
134: Recommended material determination unit
140: Output unit
Claims (9)
- A material recommendation device comprising: an input unit that receives input of information indicating a product type and an emotion word, the emotion word being a word expressing an emotion; a material parameter extraction unit that extracts a sensory parameter value for each material corresponding to the product type input to the input unit, based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation humans experience from the materials; an emotion word parameter extraction unit that extracts a sensory parameter value corresponding to the emotion word input to the input unit, based on emotion word parameter information in which the emotion word and sensory parameter values are associated with each other; and a recommended material determination unit that determines the material recommended for the emotion word input to the input unit, based on the sensory parameter values for each material extracted by the material parameter extraction unit and the sensory parameter values extracted by the emotion word parameter extraction unit.
- The material recommendation device according to claim 1, wherein the emotion word is a word classified as an evaluation factor or an activity factor among the factors of the EPA (Evaluation, Potency, Activity) structure defined in impression evaluation by the SD (Semantic Differential) method.
- The material recommendation device according to claim 1 or 2, further comprising a correction unit that corrects the sensory parameter values using coefficients that take into account a cross-modal effect in which the impressions received through a plurality of types of senses are related to each other.
- The material recommendation device according to claim 3, wherein the coefficients are set to different values according to the order in which the impressions are received by the plurality of types of senses, and the correction unit acquires information indicating the order and performs the correction using the coefficients corresponding to the acquired order.
- The material recommendation device according to claim 4, wherein the coefficient used to correct the sensory parameter values for each material is set to 1 for the sensation received first, to a value greater than 0 and less than 1 for a sensation received later, and to 0 for any other sensation.
- The material recommendation device according to claim 4, wherein the coefficient used to correct the sensory parameter values of the emotion word is set to 1 for a sensation in which the cross-modal effect occurs, and to 0 for a sensation in which the cross-modal effect does not occur.
- The material recommendation device according to claim 1 or 2, wherein the senses are the five senses or senses obtained by further subdividing the five senses.
- A material recommendation method comprising: an input step of receiving input of information indicating a product type and an emotion word, the emotion word being a word expressing an emotion; a material parameter extraction step of extracting a sensory parameter value for each material corresponding to the product type input in the input step, based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation humans experience from the materials; an emotion word parameter extraction step of extracting a sensory parameter value corresponding to the emotion word input in the input step, based on emotion word parameter information in which the emotion word and sensory parameter values are associated with each other; and a recommended material determination step of determining the material recommended for the emotion word input in the input step, based on the sensory parameter values for each material extracted in the material parameter extraction step and the sensory parameter values extracted in the emotion word parameter extraction step.
- A program for causing a computer to execute: an input step of receiving input of information indicating a product type and an emotion word, the emotion word being a word expressing an emotion; a material parameter extraction step of extracting a sensory parameter value for each material corresponding to the product type input in the input step, based on material parameter information in which materials used in products belonging to the product type are associated with sensory parameter values indicating the degree of sensation humans experience from the materials; an emotion word parameter extraction step of extracting a sensory parameter value corresponding to the emotion word input in the input step, based on emotion word parameter information in which the emotion word and sensory parameter values are associated with each other; and a recommended material determination step of determining the material recommended for the emotion word input in the input step, based on the sensory parameter values for each material extracted in the material parameter extraction step and the sensory parameter values extracted in the emotion word parameter extraction step.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2024502427A JPWO2024071072A1 (en) | 2022-09-26 | 2023-09-26 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-152385 | 2022-09-26 | ||
JP2022152385 | 2022-09-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024071072A1 true WO2024071072A1 (en) | 2024-04-04 |
Family
ID=90477878
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/034824 WO2024071072A1 (en) | 2022-09-26 | 2023-09-26 | Material recommendation device, material recommendation method, and program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2024071072A1 (en) |
WO (1) | WO2024071072A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010092423A (en) * | 2008-10-10 | 2010-04-22 | Japan Science & Technology Agency | Approximate tactile material recommendation system |
WO2017026146A1 (en) * | 2015-08-10 | 2017-02-16 | ソニー株式会社 | Information processing device, information processing method, and program |
WO2020012788A1 (en) * | 2018-07-09 | 2020-01-16 | 株式会社 資生堂 | Information processing device, device for generating substance to be applied, and program |
2023
- 2023-09-26: JP application JP2024502427A (JPWO2024071072A1) filed, status active pending
- 2023-09-26: WO application PCT/JP2023/034824 (WO2024071072A1) filed, status unknown
Also Published As
Publication number | Publication date |
---|---|
JPWO2024071072A1 (en) | 2024-04-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2024502427 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23872295 Country of ref document: EP Kind code of ref document: A1 |