WO2022097457A1 - Design evaluating device, design evaluating system, design evaluating method, design evaluating program, and learning device - Google Patents


Info

Publication number
WO2022097457A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, user, design, information, sensitivity
Application number: PCT/JP2021/038376
Other languages: French (fr), Japanese (ja)
Inventors: 真一郎 斉藤, 暁 井上, 勝一 浦谷, 健二 寺田, 伸彦 高嶋, 麻由香 加羽澤
Original Assignee: コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority date: (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by コニカミノルタ株式会社
Priority to JP2022560698A: JPWO2022097457A1 (ja)
Publication of WO2022097457A1 (en)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • the present invention relates to a design evaluation device, a design evaluation system, a design evaluation method, a design evaluation program, and a learning device.
  • Patent Document 1 discloses a technique in which an image of an object to be evaluated and items for evaluating the sensibility impression of the image are presented to a subject, the subject is made to evaluate the image, and the subject's gaze transitions are measured while the subject evaluates the image. The purpose of this technique is to calculate a unified sensibility evaluation for the image by adding the subject's line-of-sight information to account for the variation in sensibility evaluation among subjects.
  • A technique is also known for calculating the correlation between the subject's line-of-sight position on the image of the evaluation object and the saliency of the evaluation object (a measure of how easily it attracts human visual attention).
  • Also disclosed is a line-of-sight analysis system 1 including a line-of-sight tracking device 3 for measuring a subject's line of sight, a physiological index measuring device 4 for measuring the subject's physiological indices, and at least a line-of-sight analysis device 2 that estimates the subject's psychological state from the physiological indices and analyzes the subject's line of sight in association with that psychological state.
  • In this technique, when the concentration of the subject's line-of-sight position is equal to or higher than a concentration threshold and the degree of saliency is equal to or higher than a saliency threshold, the location is judged to attract the line of sight frequently.
  • Patent Document 1 aims to reduce variation in evaluation among individuals with respect to the image of the evaluation target. Therefore, when a seller or the like formulates a sales strategy that takes the subjectivity of individual consumers into account, there is a problem in that it is difficult to understand what kind of individual sensibility the image of the evaluation target appeals to.
  • Accordingly, an object of the present invention is to provide a design evaluation device, a design evaluation system, a design evaluation method, a design evaluation program, and a learning device capable of visualizing the user's sensibilities regarding an image of an evaluation object.
  • A design evaluation device comprising: an image acquisition unit that acquires at least one image of an evaluation target; an attraction degree estimation unit that estimates the attraction degree of the image based on the characteristics of the image; a display control unit that displays the image on a display device; an expression providing unit that provides the user with a plurality of expressions related to the design of the evaluation target; a line-of-sight information acquisition unit that acquires information about the user's line of sight with respect to the image displayed on the display device; an attraction degree measuring unit that measures the attraction degree of the image based on the information about the line of sight; a difference calculation unit that calculates the difference between the estimated attraction degree estimated by the attraction degree estimation unit and the measured attraction degree measured by the attraction degree measuring unit; a sensitivity information acquisition unit that acquires sensitivity information indicating the relationship between the expressions and the user's sensibilities regarding the design; and a sensitivity extraction unit that extracts, using the sensitivity information, the relationship between the difference and the plurality of expressions.
  • The design evaluation device wherein the difference calculation unit sets a plurality of mutually non-overlapping regions on the image and calculates the difference in each region, and the sensitivity extraction unit extracts the relationship between the difference in each region and the plurality of expressions.
  • The design evaluation device further comprising an attraction degree conversion value calculation unit that calculates an attraction degree conversion value based on the measured attraction degree and the sensitivity information in each of the regions, wherein the sensitivity information acquisition unit receives the user's response to the expressions regarding the design and acquires the sensitivity information based on the response, and the sensitivity extraction unit extracts the relationship based on the correlation between the difference and the attraction degree conversion value.
  • The design evaluation device wherein the sensitivity extraction unit extracts the relationship between the difference and the plurality of expressions together with information regarding the user's attributes in each of the regions.
  • (6) The design evaluation device further comprising an attribute information acquisition unit that acquires information related to the user's attributes.
  • The design evaluation device according to (2) above, wherein the sensitivity extraction unit extracts the relationship using a machine learning model obtained by machine learning with the sensitivity information and the information on the user's attributes as explanatory variables and the difference as an objective variable.
  • The design evaluation device according to any one of (1) to (8) above, wherein the sensitivity information acquisition unit acquires the user's evaluation regarding the design and calculates the sensitivity information based on the user's evaluation.
  • The design evaluation device according to any one of (1) to (8) above, wherein the sensitivity information acquisition unit acquires the response time required for the user to complete the response to the expression and calculates the sensitivity information based on the response time.
  • The design evaluation device according to any one of (1) to (10) above, wherein the line-of-sight information acquisition unit continuously acquires information on the line of sight from the time the expression is provided to the user by the expression providing unit until the user's response to the expression is received by the sensitivity information acquisition unit.
  • The design evaluation device wherein the image acquisition unit acquires a plurality of images of the evaluation target, the attraction degree estimation unit estimates the attraction degree of each image based on the characteristics of each image, the display control unit displays each of the images on the display device, and the attraction degree measuring unit measures the attraction degree of each of the images based on the information regarding the line of sight.
  • The design evaluation device wherein the image acquisition unit acquires an image of the evaluation object to which decorative printing is applied and an image of the evaluation object to which decorative printing is not applied.
  • A design evaluation system comprising: the display device; a line-of-sight measuring device that measures the user's line of sight with respect to the image displayed on the display device and outputs information about the line of sight to the line-of-sight information acquisition unit; and the design evaluation device according to any one of (1) to (15) above.
  • A design evaluation method comprising: a step (g) of calculating the difference between the estimated attraction degree and the measured attraction degree; a step (h) of acquiring sensitivity information indicating the relationship between the expressions and the user's sensibilities regarding the design; and a step (i) of extracting, using the sensitivity information, the relationship between the difference and the plurality of expressions.
  • In the step (g), a plurality of regions that do not overlap each other are set for the image and the difference is calculated in each region, and in the step (i), the relationship is extracted for each region.
  • Prior to the step (i), the method further comprises a step (j) of acquiring information regarding the user's attributes, and in the step (i), the sensitivity information and the information regarding the user's attributes are used as explanatory variables.
  • A design evaluation program for causing a computer to execute the processes included in the design evaluation method according to any one of (17) to (19) above.
  • A learning device comprising: a storage unit that stores learning data including the sensitivity information as an explanatory variable and the attraction degree difference as an objective variable; and a model generation unit that generates a machine learning model by machine learning the relationship between the sensitivity information and the attraction degree difference based on the learning data stored in the storage unit.
  • According to the present invention, the attraction degree based on the user's sensibilities is calculated, and the relationship between that attraction degree and a plurality of expressions related to the image of the evaluation object is extracted. This makes it possible to visualize the user's sensibilities regarding the image of the evaluation object.
  • FIG. 1 is a schematic diagram of the design evaluation system according to the first embodiment as viewed from above. FIG. 2 is a block diagram illustrating the schematic hardware configuration of the design evaluation system shown in FIG. 1. FIG. 3 is a functional block diagram illustrating the main functions of the control device shown in FIG. 2. FIG. 4 is a diagram illustrating the sensitivity question items for a design proposal of a food package. FIGS. 5A and 5B are flowcharts illustrating the design evaluation method of the control device according to the first embodiment. FIG. 6 is a schematic diagram illustrating a design proposal of the evaluation object. FIG. 7 is a schematic diagram illustrating the plurality of areas 1 to 4 set for the image of the design proposal.
  • FIG. 23A is a schematic diagram illustrating design proposal 1 of the evaluation object and the area setting of its image.
  • FIG. 1 is a schematic view of the design evaluation system 100 according to the first embodiment as viewed from above
  • FIG. 2 is a block diagram illustrating a schematic hardware configuration of the design evaluation system 100 shown in FIG. 1.
  • FIG. 3 is a functional block diagram illustrating the main functions of the control device 300 shown in FIG.
  • FIG. 4 is a diagram illustrating the sensitivity question items regarding the design proposal of the food package.
  • The design evaluation system 100 of the present embodiment presents an image 30 of the evaluation object and words 40 relating to the design of the evaluation object, and extracts, for the image 30, the relationship between the attraction degree (ease of visual attention) based on the sensibilities of the user 10 and the plurality of words 40. As a result, among these words 40, the words 40 that resonate with the sensibilities of the user 10 with respect to the design of the evaluation object are determined.
  • the evaluation object may be, for example, a product designed by a designer, a product package, an advertisement, or the like.
  • the design evaluation system 100 includes a line-of-sight measuring device 200, a control device 300, a display device 400, an input device 500, an audio output device 600, and a communication device 700. These configurations are communicably connected to each other.
  • the line-of-sight measuring device 200 continuously measures the line-of-sight 20 of the user 10 and transmits information about the line-of-sight to the control device 300.
  • the control device 300 functions as a design evaluation device, causes the display 410 of the display device 400 to display the image 30 of the evaluation target, and presents the image 30 to the user 10.
  • The information regarding the line of sight can be, for example, the position (hereinafter referred to as the "line-of-sight position") 50 that the user 10 is looking at on the image 30 of the evaluation object, the line-of-sight directions of the left and right eyes of the user 10, or the positions of the pupils of the left and right eyes of the user 10.
  • the line-of-sight measuring device 200 can be attached to the head of the user 10, for example.
  • a known eye tracking technique can be adopted for the line-of-sight measurement by the line-of-sight measuring device 200.
  • the eye tracking technique may be a non-contact technique such as a corneal reflex method or a scleral reflex method.
  • a contact-type technique such as a search coil method or an eye potential method, or a line-of-sight measurement method other than these may be used.
  • When measuring the line of sight using the corneal reflection method, the line-of-sight measuring device 200 irradiates each of the left and right eyes of the user 10 with weak near-infrared light and images the left and right eyes with a camera, for example. The line-of-sight measuring device 200 then calculates, for each eye, the center position of the near-infrared light reflected on the corneal surface and the center position of the pupil from the camera image, and detects the line of sight 20 of the user 10 based on these. As a result, the line of sight 20 of the user 10 is accurately measured while the user 10 is gazing at the image of the evaluation object.
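As a rough illustration of the corneal reflection approach described above, the following is a minimal sketch of pupil-center corneal-reflection (PCCR) gaze mapping, assuming the pupil center and the corneal reflection (glint) center have already been extracted from the camera image. The polynomial feature set and the calibration matrix `coeff` are illustrative assumptions, not the patent's method.

```python
# Minimal PCCR sketch: map the pupil-glint offset of one eye to a screen position.
# `coeff` is a hypothetical 2x6 matrix fitted during the calibration step.
import numpy as np

def gaze_point(pupil: np.ndarray, glint: np.ndarray, coeff: np.ndarray) -> np.ndarray:
    dx, dy = pupil - glint                          # pupil-glint offset vector
    feats = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    return coeff @ feats                            # (x, y) line-of-sight position 50
```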
  • The line-of-sight measuring device 200 calculates the line-of-sight position of the user 10 in the image 30 of the evaluation object based on information such as the line-of-sight directions of the user 10 in three-dimensional space and the center positions of the pupils of the left and right eyes of the user 10.
  • Alternatively, the line-of-sight directions of the left and right eyes of the user 10, the center positions of the pupils of the left and right eyes, and the like may be transmitted from the line-of-sight measuring device 200 to the control device 300, and the control device 300 may be configured to calculate the line-of-sight position 50 based on this information about the line of sight.
  • the control device 300 functions as a design evaluation device and controls the design evaluation system 100 in an integrated manner.
  • the control device 300 may be a personal computer, a smartphone, a PDA (Personal Digital Assistant) in which a design evaluation program described later is installed, a main body of a tablet terminal (control device), or the like.
  • the control device 300 includes a CPU (Central Processing Unit) 310, a RAM (Random Access Memory) 320, a ROM (Read Only Memory) 330, an auxiliary storage unit 340, and the like.
  • The CPU 310 executes an OS (Operating System) and a design evaluation program deployed in the RAM 320, and controls the operation of the line-of-sight measuring device 200, the display device 400, the input device 500, the audio output device 600, and the communication device 700.
  • the design evaluation program is stored in the ROM 330 or the auxiliary storage unit 340 in advance.
  • the RAM 320 stores data or the like temporarily generated by the processing of the CPU 310.
  • the ROM 330 stores a program executed by the CPU 310, data used for executing the program, parameters, and the like.
  • the auxiliary storage unit 340 has, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), or the like, and can store a plurality of images of an evaluation target.
  • The evaluation target is imaged by a camera, a scanner, or the like, and the image is stored in, for example, JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), or PNG (Portable Network Graphics) format.
  • The auxiliary storage unit 340 also stores a plurality of words presented to the user 10. Details of these words will be described later.
  • the auxiliary storage unit 340 also stores information regarding the attributes of the user 10 when performing a monitor test described later.
  • The display device 400 includes a display 410 and is used to present the image of the evaluation object and the words related to the design proposal of the evaluation object to the user 10. Further, after the user 10 evaluates the design proposal of the evaluation object, graphs of the correlation between the attraction degree difference and the attraction degree conversion value described later (for example, FIGS. 18A to 18C) and the extraction result of the relationship between the attraction degree difference and the plurality of sensitivity question items (for example, FIG. 20) are displayed on the display 410.
  • The input device 500 includes, for example, a keyboard and a mouse, and is used by the user 10 for various instructions (inputs) such as character input and various settings. In the present embodiment, it is used for inputting the information (described later) regarding the attributes of the user 10 in the monitor test, for input when the user 10 evaluates the design proposal of the evaluation object, and the like. Further, the input device 500 has a timekeeping function, and measures the time taken by the user 10 to complete the evaluation of the design proposal (hereinafter referred to as the "response time") after the image of the evaluation object and the words related to the design proposal are displayed on the display of the display device 400.
  • The audio output device 600 has a speaker and can provide the user 10 with the words related to the design proposal of the evaluation object by voice.
  • The communication device 700 has, for example, a communication circuit such as a network interface card (NIC), and transmits and receives data to and from external devices through a communication network (not shown).
  • By the CPU 310 executing the design evaluation program, the control device 300 functions as an image acquisition unit 351, an attribute information acquisition unit 352, an attraction degree estimation unit 353, an attraction degree measuring unit 354, a difference calculation unit 355, a display control unit 356, an expression providing unit 357, a line-of-sight information acquisition unit 358, a sensitivity information acquisition unit 359, an attraction degree conversion value calculation unit 360, a sensitivity extraction unit 361, and a sensitivity output unit 362.
  • the image acquisition unit 351 acquires at least one image of the evaluation target object stored in the auxiliary storage unit 340, for example. Further, the image acquisition unit 351 can also acquire an image of the evaluation target object from an external server or the like via the communication device 700.
  • the attribute information acquisition unit 352 acquires the attributes and personas of the user 10 (hereinafter referred to as "information regarding the attributes of the user 10").
  • the attributes of the user 10 can be classified into, for example, demographic attributes, psychological attributes, and behavioral attributes.
  • Demographic attributes include, for example, attributes such as age (age group), gender, and location (place of residence, place of work).
  • Psychological attributes include, for example, attributes such as values and propensity to consume.
  • the behavioral attributes include, for example, attributes such as product purchase history and action range.
  • the attribute information acquisition unit 352 displays a questionnaire or the like including a question on the display device 400, and prompts the user 10 to answer the questionnaire.
  • The questionnaire may include, for example, questions about the age (age group) and gender of the user 10 as demographic attributes, and about the consumption behavior of the user 10 as a psychological attribute. Questions about consumption behavior include, for example, whether the user often buys things that others buy, or branded goods.
  • The attribute information acquisition unit 352 can also determine attributes by analyzing the behavior of the user 10 based on image information or the like collected in advance. For example, it is possible to estimate the age from an image of the face of the user 10, or to judge the personality or the like from the behavior of the user 10 or the movement of the line of sight. For example, if a camera in a store captures the behavior of the user 10 in front of a display shelf and the user 10 spends a long time reading the description on a product package before purchasing, the user 10 can be judged to think carefully or logically. Further, by using the questionnaire together with the analysis of the user 10's behavior, the accuracy of determining (acquiring) the attributes can be improved.
  • A persona is, for example, a fictitious image of a consumer who purchases products and services, characterized by occupation, job title, annual income, hobbies, values, lifestyle, and the like.
  • Information about the attributes of the user 10 is stored in, for example, the auxiliary storage unit 340.
  • The attraction degree estimation unit 353 estimates (simulates) the saliency of the image of the evaluation target.
  • In the present embodiment, the ease with which a part of an image attracts human visual attention due to the characteristics of the image is referred to as "saliency", and the numerical value representing the degree of saliency is referred to as the "saliency degree".
  • For example, a part to which the human retina reacts strongly when looking at the evaluation object has a high degree of saliency.
  • The attraction degree estimation unit 353 generates a saliency map showing the distribution of saliency over the entire image of the evaluation target. When the saliency map is displayed in color, portions with high saliency are displayed in warm colors such as red, and portions with low saliency are displayed in cool colors; blue corresponds to the lowest level of saliency. In this way, a saliency map of the image of the evaluation object is generated.
  • A method for generating a saliency map is also described in, for example, Japanese Patent Application Laid-Open No. 2019-119164; since it is a known technique, detailed description is omitted, but the outline is as follows.
  • The attraction degree estimation unit 353 creates a first image group consisting of a plurality of images obtained by repeatedly subsampling the image of the evaluation object every other pixel and enlarging the results to the same size as the image of the evaluation object. A second image group is then created by extracting the edge features of each image included in the first image group.
  • The attraction degree estimation unit 353 also creates a third image group by extracting the luminance features of each image included in the first image group, and a fourth image group by extracting the hue features of R (red), G (green), B (blue), and Y (yellow) of each image included in the first image group.
  • The attraction degree estimation unit 353 calculates the degree of saliency for each unit region based on at least one of the second, third, or fourth image groups. Further, the attraction degree estimation unit 353 creates a fifth image group extracting the differences between the R (red) and G (green) images included in the fourth image group, and a sixth image group extracting the differences between the B (blue) and Y (yellow) images included in the fourth image group. Then, the attraction degree estimation unit 353 creates the saliency map by normalizing and adding up the second, third, fifth, and sixth image groups.
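The multi-scale procedure above can be sketched as follows. This is a minimal sketch assuming OpenCV and NumPy; the exact filters, pyramid depth, and weighting are not specified in the outline, so the Laplacian edge extraction, four pyramid levels, and equal-weight summation here are illustrative choices.

```python
# Minimal saliency-map sketch: luminance and R-G / B-Y opponency features are
# computed at several downsampled-and-enlarged scales, edge-filtered, normalized,
# and summed, loosely following the first-to-sixth image groups described above.
import cv2
import numpy as np

def saliency_map(bgr: np.ndarray) -> np.ndarray:
    h, w = bgr.shape[:2]
    b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
    lum = (r + g + b) / 3.0                          # luminance (third image group)
    yel = (r + g) / 2.0 - np.abs(r - g) / 2.0 - b    # yellow channel for opponency
    rg, by = r - g, b - yel                          # R-G / B-Y (fifth, sixth groups)

    acc = np.zeros((h, w), np.float32)
    for feat in (lum, rg, by):
        scale = feat
        for _ in range(4):                           # coarser scales (first group)
            scale = cv2.pyrDown(scale)               # subsample every other pixel
            up = cv2.resize(scale, (w, h))           # enlarge back to original size
            edges = np.abs(cv2.Laplacian(up, cv2.CV_32F))  # edges (second group)
            acc += cv2.normalize(edges, None, 0.0, 1.0, cv2.NORM_MINMAX)
    return cv2.normalize(acc, None, 0.0, 1.0, cv2.NORM_MINMAX)
```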
  • The attraction degree estimation unit 353 calculates the estimated attraction degree for each of a plurality of regions (hereinafter also referred to as "areas") set in the image of the evaluation target, based on the degree of saliency. The method of calculating the estimated attraction degree will be described later.
  • The attraction degree measuring unit 354 measures the attraction degree of the image of the evaluation target. On the image of the evaluation object, the user's line-of-sight position concentrates on parts that stand out due to the characteristics of the image, such as the color and shape of objects, that is, parts with a high degree of saliency, and on parts that the user is interested in. The attraction degree of these parts (or areas) is therefore considered to be high.
  • The attraction degree measuring unit 354 calculates the attraction degree in each area (referred to as the "measured attraction degree") based on the information about the line of sight, that is, the result of the line-of-sight measurement, and stores it in the RAM 320. The details of the method of calculating the measured attraction degree will be described later.
  • The difference calculation unit 355 calculates the difference between the measured attraction degree calculated by the attraction degree measuring unit 354 and the estimated attraction degree estimated by the attraction degree estimation unit 353. More specifically, the difference calculation unit 355 calculates, for each area, the difference between the measured attraction degree and the estimated attraction degree (hereinafter also referred to as the "attraction degree difference") and stores it in the RAM 320.
  • the display control unit 356 controls the display of the image of the evaluation target on the display 410 of the display device 400.
  • the display control unit 356 also controls the display of words provided by the expression providing unit 357.
  • the display control unit 356 displays, for example, an image and words of the evaluation target on the display 410 at the same time.
  • the expression providing unit 357 provides the user 10 with words regarding the design proposal of the evaluation target object.
  • The words can be appropriately selected according to the type, field, content, and the like of the design proposal of the evaluation target, in consideration of the emotional qualities that the designer or product maker of the evaluation target wants to appeal to consumers. These words can therefore correspond to the user 10's sensibilities regarding the design of the evaluation object. For example, when the evaluation target is a food package, words such as "delicious", a word related to taste, and "like", a word related to the preferences of the user 10, can be selected.
  • The words may be preset by the administrator of the monitor test or the like when the monitor test is performed on a subject.
  • the expression providing unit 357 can provide the user 10 with a plurality of words regarding the design proposal of the evaluation target object. These words are set as "sensitivity question items" in the monitor test.
  • FIG. 4 exemplifies the three words “innovative”, “delicious”, and “like” regarding the design proposal of a food package.
  • The expression presented to the user 10 may be a word, a clause, or a sentence including a question, an adjective, a feature of the evaluation object, and the like. Further, instead of, or in addition to, displaying the words on the display 410 of the display device 400, the words may be provided to the user 10 by voice through the speaker of the audio output device 600. Further, the user 10 may be provided with at least one of tactile, olfactory, and taste information in place of, or in addition to, visual or auditory information such as words.
  • In the present embodiment, words and such visual, auditory, tactile, olfactory, and taste information are collectively referred to as "expressions". Expressions can serve as sensitivity question items.
  • In the following, the case where the expression is a word will be mainly described; cases where the expression is something other than a word, for example, a smell or a taste, are handled in the same way as words.
  • the line-of-sight information acquisition unit 358 acquires information on the line-of-sight of the user 10 with respect to the image of the evaluation object displayed on the display device 400.
  • the line-of-sight information acquisition unit 358 acquires, for example, the line-of-sight position of the user 10 on the image of the evaluation target as information on the line-of-sight from the line-of-sight measuring device 200.
  • The sensitivity information acquisition unit 359 calculates the sensitivity information of the user 10 regarding the design of the evaluation target object.
  • the evaluation of the design proposal is performed, for example, by the user 10 answering the score of the design proposal for each sensitivity question item.
  • the user 10 inputs the score of the design proposal by using the keyboard and the mouse of the input device 500.
  • The sensitivity information acquisition unit 359 receives the evaluation of the user 10 for each sensitivity question item, calculates, based on the evaluation, the sensitivity information indicating the relationship between each sensitivity question item and the user 10's sensibilities regarding the design proposal, and stores it in the RAM 320.
  • While words are provided by the expression providing unit 357 and displayed on the display 410 of the display device 400, the line-of-sight information acquisition unit 358 continuously acquires information about the line of sight until the user 10's evaluation of the words is completed. That is, the presentation of words by the expression providing unit 357, the acceptance of the user 10's evaluation by the sensitivity information acquisition unit 359, and the measurement of the line of sight 20 by the line-of-sight measuring device 200 can be executed in parallel.
  • The attraction degree conversion value calculation unit 360 calculates the attraction degree conversion value based on the measured attraction degree and the sensitivity information in each area, and stores it in the RAM 320. The details of the method of calculating the attraction degree conversion value will be described later.
  • The sensitivity extraction unit 361 extracts the relationship between the attraction degree difference and the plurality of sensitivity question items (plurality of expressions) 41 to 43.
  • The sensitivity extraction unit 361 extracts, for example, the relationship between the attraction degree difference and the plurality of words based on the correlation between the attraction degree difference and the attraction degree conversion value.
  • the sensitivity extraction unit 361 can also extract the above-mentioned relationship by a statistical method or a machine learning method described later.
  • The sensitivity output unit 362 outputs to the display 410 a graph of the correlation between the attraction degree difference and the attraction degree conversion value, the extraction result of the relationship between the attraction degree difference and the plurality of sensitivity question items 41 to 43, and the like.
  • FIGS. 5A and 5B are flowcharts illustrating the design evaluation method of the control device 300 according to the first embodiment.
  • FIG. 6 is a schematic diagram illustrating a design proposal of the evaluation object.
  • the processing of the flowcharts of FIGS. 5A and 5B is realized by the CPU 310 executing the design evaluation program.
  • FIG. 7 is a schematic diagram illustrating a plurality of areas 1 to 4 set for the image of the design proposal
  • FIG. 8 is a diagram illustrating a saliency map of the image of the design proposal, and FIG. 9 is a diagram illustrating the estimated attraction degree in areas 1 to 4 of the design proposal.
  • FIG. 10 is a schematic diagram illustrating a case where the image of the design proposal is displayed in a standard state
  • FIG. 11 is a schematic diagram illustrating the distribution of the line-of-sight position of the user 10 who is viewing the image of the design proposal.
  • FIGS. 12 and 13 are diagrams illustrating the measured attraction degree and the attraction degree difference, respectively, in areas 1 to 4 of the image of the design proposal.
  • FIGS. 14A to 14C are diagrams illustrating the display of the image of the design proposal together with each sensitivity question item.
  • FIG. 15 is a diagram illustrating the measured attraction degree in areas 1 to 4 of the image of the design proposal for each sensitivity question item, and FIG. 16 is a diagram illustrating the scores input by the user 10 for the design proposal for each sensitivity question item.
  • FIG. 17 is a diagram illustrating the attraction degree conversion value in areas 1 to 4 of the image of the design proposal for each sensitivity question item.
  • FIGS. 18A to 18C are graphs illustrating the correlation between the attraction degree difference and the attraction degree conversion value for each sensitivity question item.
  • The design evaluation system 100 displays the image of the design proposal and words related to the design on the display 410, and extracts, for the image of the design proposal, the relationship between the attraction degree based on the sensibilities of the user 10 and the plurality of words. More specifically, this is done as follows.
  • an image of the evaluation target is acquired (step S101).
  • The image acquisition unit 351 acquires, for example, an image of a package of a certain food among the plurality of images stored in the auxiliary storage unit 340 as the image of the evaluation target (see FIG. 6).
  • the image of the design proposal of this evaluation object is referred to as a "design proposal image”.
  • the design proposal of FIG. 6 is, for example, a novel design proposal that combines an “hourglass” and a “food ingredient”.
  • the attribute information acquisition unit 352 reads, for example, information about the attributes of the user 10 from the auxiliary storage unit 340.
  • The information regarding the attributes of the user 10 includes, for example, gender: female, age: 20s, area of residence: Tokyo, and the like.
  • The attraction degree estimation unit 353 identifies objects (characters, figures, etc.) in the design proposal image 30 using, for example, a known image recognition technique, and based on the identification result, sets a plurality of areas (for example, areas 1 to 4) that are divided so as not to overlap each other on the design proposal image 30.
  • Area 1 is an area including the character string "hourglass of salt ⁇" at the right end of the design proposal image 30, and area 2 is an area including an illustration of an "hourglass" at the center of the design proposal image 30. Area 3 is an area including the character string "0000OO / CORPORATION" at the lower part of the design proposal image 30, and area 4 is an area including the character string "mariage of oysters and lemons" at the left end of the design proposal image 30.
  • The attraction degree estimation unit 353 generates a saliency map for the design proposal image 30.
  • The outline of the illustration of the "hourglass" in area 2 (A7) and the text "mariage of oysters and lemons" in area 4 (A8) have a medium degree of saliency.
  • the portions A1 to A6 having high saliency may be displayed in colors such as red and yellow, and the portions A7 and A8 having moderate saliency may be displayed in colors such as green.
  • A part of the design proposal image 30 with a high degree of saliency is considered to be highly attractive because it draws the attention of the user 10.
  • the attraction degree estimation unit 353 calculates the estimated attraction degree in each area.
  • The estimated attraction degree in each area represents the degree of ease with which the visual attention of the user 10 is drawn due to the characteristics of the design proposal image 30, and can be defined, for example, as the ratio of the degree of saliency in each area to the total saliency over all areas. Therefore, the higher the degree of saliency, the higher the estimated attraction degree of an area, and the lower the degree of saliency, the lower the estimated attraction degree.
  • Accordingly, the sum of the estimated attraction degrees over all areas is 1.0.
  • FIG. 9 illustrates the estimated attraction degree in areas 1 to 4.
  • In the example of FIG. 9, the estimated attraction degree in area 2 is 0.52, which is higher than the estimated attraction degrees in the other areas 1, 3, and 4. Therefore, area 2 is considered more likely to attract the attention of the user 10 than the other areas. That is, owing to the saliency of area 2, the design proposal image 30 inherently concentrates more than 50% of its attraction in area 2. Area 2 of the design proposal image 30 is therefore considered to attract the attention not only of the user 10 but also of other users.
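Under the definition above, the estimated attraction degree of an area is simply its share of the total saliency. A minimal sketch, assuming a saliency map and boolean masks for the areas (names illustrative):

```python
# Estimated attraction degree per area = area saliency mass / total saliency mass.
import numpy as np

def estimated_attraction(sal: np.ndarray, areas: dict) -> dict:
    mass = {name: float(sal[mask].sum()) for name, mask in areas.items()}
    total = sum(mass.values()) or 1.0
    return {name: m / total for name, m in mass.items()}   # sums to 1.0
```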
  • Next, the image of the evaluation target is displayed (step S104).
  • the display control unit 356 controls the design proposal image 30 to be displayed on the display 410 of the display device 400.
  • Displaying only the design proposal image 30, without any accompanying sensitivity question item, is referred to as "display in the standard state".
  • the display in the standard state is performed for a predetermined time (for example, 10 seconds).
  • the information regarding the line of sight of the user 10 with respect to the image of the evaluation object displayed on the display device 400 is acquired (step S105).
  • the line-of-sight measuring device 200 measures the line of sight of the user 10 with respect to the design proposal image 30 displayed on the display device 400, and transmits information about the line of sight to the line-of-sight information acquisition unit 358.
  • the line-of-sight measuring device 200 is calibrated according to the eyes of the user 10. In the calibration, the geometric features of the user 10's eyes are acquired in order to accurately calculate the user 10's line-of-sight position.
  • The line-of-sight measuring device 200 continuously measures the line of sight of the user 10 over the predetermined time, and during this period the display control unit 356 displays on the display 410 the locus of the line-of-sight positions of the user 10, that is, the distribution of the line-of-sight positions over the predetermined time. As a result, the person administering the monitor test can easily grasp the change over time of the line-of-sight position of the user 10 on the display surface of the display 410.
  • the data of the distribution of the line-of-sight position at a predetermined time is stored in the RAM 320.
  • The distribution of the line-of-sight positions can be illustrated as a heat map. In the heat map, a zone B1 where the line of sight is strongly concentrated is painted in warm colors (the dark part in the figure; the same applies hereinafter), and a zone B2 where the concentration of the line of sight is weak is painted in cool colors (the light colored part; the same applies hereinafter).
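Such a heat map can be accumulated from the measured line-of-sight positions. A minimal sketch, assuming gaze samples arrive as pixel coordinates at a fixed sampling rate; the Gaussian spread is an illustrative choice:

```python
# Accumulate gaze samples into a normalized heat map (warm = concentrated).
import cv2
import numpy as np

def gaze_heatmap(points, h: int, w: int) -> np.ndarray:
    acc = np.zeros((h, w), np.float32)
    for x, y in points:
        if 0 <= x < w and 0 <= y < h:
            acc[y, x] += 1.0                  # one dwell tick per gaze sample
    acc = cv2.GaussianBlur(acc, (0, 0), 25)   # spread ticks into B1/B2-like zones
    return cv2.normalize(acc, None, 0.0, 1.0, cv2.NORM_MINMAX)
```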
  • the attraction degree measuring unit 354 measures the attraction degree for each area of the design proposal image 30 based on the information regarding the line of sight.
  • The measured attraction degree is defined, for example, as the ratio of the dwell time of the line-of-sight position in each area to the total dwell time over all areas.
  • In the display in the standard state, the total dwell time over all areas is equal to the above-mentioned predetermined time. Therefore, the longer the line-of-sight position stays in an area, the higher the measured attraction degree of that area, and the shorter the stay, the lower the measured attraction degree.
  • Accordingly, the sum of the measured attraction degrees over all areas is 1.0.
  • FIG. 12 illustrates the measured attraction degree in areas 1 to 4.
  • In the example of FIG. 12, the measured attraction degree in area 2 is 0.82, which is higher than the measured attraction degrees in the other areas 1, 3, and 4. Therefore, area 2 is considered to attract the attention of the user 10 more than the other areas.
  • That is, the line-of-sight position of the user 10 stayed in area 2 for more than 80% of the predetermined time.
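The measured attraction degree follows directly from the dwell-time definition above. A minimal sketch, assuming each gaze sample counts as one tick of dwell time and a hypothetical `area_of` function maps a sample to the area containing it:

```python
# Measured attraction degree per area = dwell ticks in area / total dwell ticks.
from collections import Counter

def measured_attraction(points, area_of) -> dict:
    dwell = Counter(area_of(x, y) for x, y in points)
    total = sum(dwell.values()) or 1
    return {name: t / total for name, t in dwell.items()}   # sums to 1.0
```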
  • Next, the difference between the measured attraction degree and the estimated attraction degree is calculated for each area (step S107).
  • The difference calculation unit 355 calculates, for each area, the difference between the measured attraction degree and the estimated attraction degree (the attraction degree difference).
  • FIG. 13 illustrates the attraction degree difference in areas 1 to 4. As described above, the measured attraction degree is considered to include the attraction (that is, the estimated attraction degree) due to the saliency of the design proposal image 30. Therefore, by subtracting the estimated attraction degree from the measured attraction degree, the attraction degree based on the sensibilities of the user 10 can be calculated.
  • In an area (or place) of the design proposal image 30 where the attraction degree difference is approximately 0 (zero), the measured attraction degree includes only the attraction due to the saliency of the design proposal image 30, and the attraction based on the sensibilities of the user 10 is considered to be approximately 0 or extremely small. On the other hand, where the attraction degree difference is a positive value, the attraction based on the user's sensibilities is considered to be included in addition to the attraction due to the saliency.
  • In other words, the attraction degree difference corresponds to the attraction based on the sensibilities of the user 10, and the larger the attraction degree difference, the greater the contribution of the sensibilities of the user 10 to the measured attraction degree.
  • a plurality of sensitivity question items 41 to 43 regarding the design proposal of the evaluation target are provided to the user 10 (step S108).
  • the expression providing unit 357 selects three words, for example, "innovative”, “delicious”, and “like” as the sensitivity question items.
  • the display control unit 356 controls, for example, to display these three sensitivity question items 41 to 43 on the display 410 one by one in order at predetermined time intervals.
  • the predetermined time interval is a time during which the user 10 can sufficiently evaluate the design proposal for each sensitivity question item, and can be set to, for example, 10 seconds in accordance with the "time limit" described later.
  • For each sensitivity question item displayed, the display control unit 356 also displays a message requesting the evaluation of the design proposal, for example, "Enter a score with the keyboard (1 to 10 points)".
  • The monitor test is a test related to latent consciousness. Therefore, before the start of the monitor test, the user 10 receives an explanation that the design proposal should be evaluated for each sensitivity question item in the shortest possible time, that there are three sensitivity question items, and that the display switches to the next sensitivity question item at 10-second intervals.
  • the display control unit 356 may be configured to include the time limit in the message and display it on the display 410 in order to confirm the time limit to the user 10. Further, the sensitivity question items are not limited to three, and may be two or four or more.
  • the information regarding the line of sight of the user 10 with respect to the image of the evaluation object displayed on the display device 400 is acquired (step S109).
  • the line-of-sight measuring device 200 measures the line of sight of the user 10 with respect to the design proposal image 30 displayed on the display device 400, and transmits information about the line of sight to the line-of-sight information acquisition unit 358.
  • User 10 evaluates the design proposal according to his / her own sensibility for each sensibility question item. Specifically, the user 10 inputs the score of the design proposal by using the keyboard and the mouse of the input device 500.
  • The input of the score can be set to be performed within a predetermined time limit (for example, 10 seconds). For example, if no answer is given within the time limit, the input of the score for the displayed sensitivity question item is regarded as completed, and if there is an undisplayed sensitivity question item, control can be performed so as to move on to that next item. This is to obtain an evaluation of the design proposal made according to the sensibilities of the user 10 and to avoid placing an undue burden on the user 10 (subject). The present invention is not limited to such a case; if even one of the three sensitivity question items is not answered within the time limit, the monitor test may be forcibly terminated.
  • The time limit is set for the answer so that the user 10 evaluates the design proposal intuitively based on his or her own sensibilities, without evaluating it based on knowledge or logical thinking. Sensitivity question items for which the user 10's answer exceeds the time limit are treated as no answer or an invalid answer.
  • the measurement results obtained by redoing the test can be used for the purpose of supplementing the measurement results of valid answers.
  • Based on the length of the answer time for each sensitivity question item, the sensitivity information acquisition unit 359 can determine whether the user 10's answer is more likely a latent (sensibility-based) answer or a rational judgment. That is, it is presumed that when the response time is short, the user 10 made a latent judgment, and when the response time is long, the user 10 made a rational judgment.
  • The temporal threshold for dividing latent from rational judgments is not particularly limited and can be set appropriately according to the complexity of the design of the evaluation object. For example, if the design is not complicated, as in the design proposal of FIG. 6, the threshold can be set to about 4 to 5 seconds. On the other hand, when the design is complicated or includes a product description, the threshold may be set to about 9 to 10 seconds after extending the time limit in consideration of the time for the user 10 to read the description.
  • the Kansei information acquisition unit 359 determines whether or not the evaluation of the design proposal by the user 10 has been completed for each Kansei question item (step S110). As a result of the determination, if the evaluation is not completed (step S110: NO), the process returns to the process of S109.
  • the measurement attraction degree in each area is calculated for each sensitivity question item (step S111).
  • As described above, the measured attraction degree is calculated as the ratio of the dwell time of the line-of-sight position in each area to the total dwell time over all areas; here, the total dwell time corresponds to the response time for the sensitivity question item. Therefore, the total dwell time may differ among the sensitivity question items 41 to 43.
  • FIG. 15 illustrates the measured attraction degree in areas 1 to 4 for each sensitivity question item.
  • For example, the measured attraction degrees when "innovative" is presented are 0.04, 0.83, 0.10, and 0.03 in areas 1 to 4, respectively.
  • The sum of the measured attraction degrees over all areas is 1.0.
  • the sensitivity information of the user 10 regarding the design of the evaluation target is calculated (step S112).
  • The sensitivity information acquisition unit 359 acquires the score input by the user 10 for the design proposal when each sensitivity question item was presented.
  • FIG. 16 illustrates the points input by the user 10 regarding the design proposal when each sensitivity question item is presented.
  • In this example, the scores input when "innovative", "delicious", and "like" were presented are 6, 3, and 8 points, respectively.
  • The sensitivity information acquisition unit 359 receives the above scores as the user 10's evaluation for each sensitivity question item, and calculates, based on the evaluation, the sensitivity information indicating the relationship between each sensitivity question item and the user 10's sensibilities regarding the design proposal.
  • For example, the sensitivity information acquisition unit 359 calculates the sensitivity information so as to be proportional to the user 10's evaluation of each sensitivity question item. As illustrated in FIG. 16, when the user 10's evaluations of the design proposal are 6, 3, and 8 points for "innovative", "delicious", and "like", respectively, the ratio of the sensitivity information of the sensitivity question items is set to 6:3:8.
  • the attractiveness conversion value is calculated for each area (step S113).
  • The sensitivity information acquisition unit 359 calculates the attraction degree conversion value for each area based on the measured attraction degree and the sensitivity information in each area for each sensitivity question item.
  • The attraction degree conversion value is represented by the product of the measured attraction degree and the sensitivity information in each area.
  • FIG. 17 illustrates the attraction degree conversion value of each area for each sensitivity question item.
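Steps S112 and S113 can be sketched together as follows. This assumes sensitivity information proportional to the input scores (normalized here as an illustrative choice; the text only fixes the 6:3:8 ratio) and the conversion value as the per-area product of measured attraction degree and sensitivity information. The numbers echo the FIG. 15 and FIG. 16 examples.

```python
# Sensitivity information from scores, then attraction degree conversion values.
scores = {"innovative": 6, "delicious": 3, "like": 8}            # user 10's inputs
sens_info = {k: v / sum(scores.values()) for k, v in scores.items()}

measured = {  # measured attraction degree per area while each item was shown
    "innovative": {"area1": 0.04, "area2": 0.83, "area3": 0.10, "area4": 0.03},
}
conversion = {item: {a: m * sens_info[item] for a, m in per_area.items()}
              for item, per_area in measured.items()}
```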
  • The sensitivity output unit 362 outputs, for example, a graph showing the correlation between the attraction degree difference and the attraction degree conversion value to the display 410.
  • As a result, the sensibilities of the user 10 with respect to the plurality of sensitivity question items 41 to 43 are visualized for the design proposal image 30.
  • In the example of FIGS. 18A to 18C, the sensitivity extraction unit 361 extracts the relationship that, for each of "innovative", "delicious", and "like", the attraction degree difference in area 2 is a relatively large positive value of 0.3, while in areas 1, 3, and 4 the attraction degree difference is a negative value. Further, based on this relationship, the sensitivity extraction unit 361 determines that the attraction based on the sensibilities of the user 10 is high in area 2 and low in areas 1, 3, and 4.
  • The attraction degree difference reflects the attraction based on the sensibilities of the user 10, but does not take into account the influence of the sensitivity question items on the user 10.
  • On the other hand, since the attraction degree conversion value is the product of the measured attraction degree and the sensitivity information, the influence of the sensitivity question item on the user 10 is reflected in the sensitivity information. Therefore, in a Cartesian graph with the attraction degree difference on the horizontal axis and the attraction degree conversion value on the vertical axis, the higher the correlation between the attraction degree difference and the attraction degree conversion value, the stronger the influence of the sensitivity question item on the user 10 is judged to be.
  • Therefore, the sensitivity extraction unit 361 extracts the relationship between the attraction degree difference and the plurality of sensitivity question items based on the correlation between the attraction degree difference and the attraction degree conversion value. This makes it possible to determine which sensitivity question item contributes to a large attraction degree difference.
  • The sensitivity extraction unit 361 extracts the relationship between the attraction degree difference and the plurality of sensitivity question items and the user's attributes in each area.
  • In this example, the attributes of the user 10, who is the subject, are: age in the twenties, gender female, and workplace at a company in the city center. It is considered possible that area 2 drew the eyes of the user 10 more than the other areas because of such attributes (background) of the user 10. Further, due to the attributes of the user 10, it is considered that the impressions of "innovative" and "like" are good in area 2, while the appeal of "delicious" is weak.
  • As described above, in the present embodiment, one design proposal image 30 of the evaluation target is acquired, and the estimated attraction degree is calculated based on the characteristics of the design proposal image 30. Further, the design proposal image 30 is displayed on the display device 400, and the plurality of sensitivity question items 41 to 43 regarding the design of the evaluation object are provided to the user 10. Subsequently, information regarding the line of sight of the user 10 with respect to the design proposal image 30 displayed on the display device 400 is acquired, and the measured attraction degree of the image is calculated based on this information. Then, the attraction degree difference, which is the difference between the estimated attraction degree and the measured attraction degree, is calculated, and the relationship between the attraction degree difference and the plurality of sensitivity question items is extracted.
  • Before the actual monitor test, a process may be included in which a dummy image (not shown) and dummy sensitivity question items are displayed on the display 410 and the subject practices the contents of the monitor test. As a result, the subject can face the actual monitor test after becoming accustomed to its contents. The measurement results of the practice using the dummy image are not added to the measurement results of the actual monitor test.
  • In the above, the case where the sensitivity information acquisition unit 359 acquires the user 10's evaluation (score) of the design proposal and calculates the sensitivity information based on that evaluation has been described.
  • However, the calculation method is not limited to such a case.
  • For example, the sensitivity information acquisition unit 359 may be configured to acquire the response time of the user 10 for a sensitivity question item and calculate the sensitivity information based on the response time.
  • The sensitivity information can be calculated, for example, as the reciprocal of the response time.
  • Alternatively, a camera mounted on the design evaluation system 100 can capture the facial expression and posture of the user 10, and the captured image can be subjected to image processing such as pattern recognition to estimate the sensitivity information of the user 10 for the sensitivity question item. As a result, the sensitivity information can be acquired while reducing the answering burden placed on the user 10.
  • FIG. 19 is a diagram illustrating each variable when the relationship between the attraction degree difference and the plurality of sensitivity question items 41 to 43 is extracted using multiple regression analysis, and FIG. 20 is a diagram illustrating the regression coefficients β2 to β4 calculated by the multiple regression analysis of FIG. 19.
  • FIGS. 18A to 18C the case where the correlation between the attractiveness difference and the attractiveness conversion value is displayed in a graph for the plurality of Kansei question items 41 to 43 is illustrated, but this embodiment is limited to such a case. Not done.
  • a statistical method eg, simple regression analysis or multiple regression analysis
  • the sensitivity question item (and the attribute of the user 10) is used as the explanatory variable, and the attractiveness difference is used as the objective variable, and the correlation between the explanatory variable and the objective variable is analyzed by statistical processing.
  • specifically, let the attractiveness conversion value for each sensitivity question item be X_ki, let the regression coefficients be β_k, let the error term be ε_i, and let the attractiveness difference in each area be Y_i. The multiple regression model is then

    Y_i = β_1 X_1i + β_2 X_2i + β_3 X_3i + ... + β_k X_ki + ε_i

Here, β_1, β_2, β_3, ..., β_k represent the influence of each explanatory variable, so the influence on the attractiveness difference can be expressed quantitatively by the magnitude of these regression coefficients. The coefficients are determined by the least-squares method: the sum of squared errors is differentiated with respect to each coefficient, each derivative is set to zero, and the resulting simultaneous (normal) equations uniquely determine β_1, β_2, ..., β_k. Therefore, as shown in FIG. 20, by solving these simultaneous equations, the regression coefficients β_2 to β_4 indicating the degree of influence of the sensitivity question items can be derived.
  • (Modification 2) A method of analyzing the relationship between explanatory variables and an objective variable by machine learning, with the sensitivity question items (and the attributes of the user 10) as explanatory variables and the attractiveness difference as the objective variable (hereinafter, the "machine learning method"), can also be adopted. In this case, the sensitivity extraction unit 361 has a machine learning model trained by supervised learning with the sensitivity information (and information about the attributes of the user 10) as explanatory variables and the attractiveness difference as the objective variable, and uses this model to extract the relationship between the attractiveness difference and the plurality of sensitivity question items.
  • FIG. 21 is a schematic block diagram illustrating a learning device 370 that generates a machine learning model based on the difference in attractiveness and the sensitivity information (and information on the attributes of the user 10).
  • the control device 300 functions as a learning device 370 by the CPU 310 executing a learning program.
  • the learning device 370 has a learning data storage unit 371 and a model generation unit 372.
  • the learning data storage unit 371 stores learning data (teacher data) including n sets of explanatory variables and objective variables acquired by performing a monitor test on n subjects.
  • the learning data may include, for example, the sensitivity information as an explanatory variable and the attractiveness difference as the objective variable. Information about the attributes of the user 10 may further be included as an explanatory variable.
  • the model generation unit 372 generates a machine learning model that machine-learns the relationship between the attractiveness difference and the sensitivity information (and the attribute of the user 10) based on the learning data stored in the learning data storage unit 371.
  • the generated machine learning model is transmitted to the sensitivity extraction unit 361 of the control device 300 and can be used to extract the relationship between the attractiveness difference and the plurality of sensitivity question items 41 to 43.
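  • As one purely illustrative realization of the learning device 370, the following sketch trains a scikit-learn regressor on synthetic stand-in data. The patent does not specify the model type, the feature encoding, or the data volume, so all of those are assumptions here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for the learning data storage unit 371: one row per
# observation, columns = sensitivity information for three question items
# plus one coded user attribute (all assumed).
rng = np.random.default_rng(0)
X = rng.random((200, 4))
y = 0.5 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(0.0, 0.05, 200)  # toy target

# Model generation unit 372: supervised learning with the attractiveness
# difference as the objective variable.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Feature importances hint at which explanatory variables drive the
# attractiveness difference.
print(model.feature_importances_.round(3))
```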
  • moreover, it is possible to realize a system that extracts, from a design database in which a large number of designs are stored in advance, multiple design proposals that are close to specified conditions. The designer in charge of the design proposal of a product package can use these designs as a reference when creating a new design proposal or modifying an existing one. Furthermore, it is conceivable that the system generates multiple modified patterns of a design proposal and presents them to the designer.
  • the control device 300 of the present embodiment described above has the following effects.
  • the control device 300 of the present embodiment calculates the degree of attraction based on the sensibility of the user 10 with respect to the design proposal image 30, and extracts the relationship between the degree of attraction and the plurality of sensitivity question items 41 to 43 regarding the design proposal image 30. Therefore, the sensibility of the user 10 regarding the design proposal image 30 can be visualized.
  • thereby, the designer who devised the design of the product package can determine, based on the relationship extracted by the control device 300, the direction in which to improve the design with respect to the sensitivity targets required by the brand owner of the product.
  • usually, a preferable design proposal is decided from multiple product package design proposals at the planning stage of a new product, and the relationships extracted as described above can also be useful when the brand owner of the product decides on a design proposal. For example, in product package design, it becomes easier to compare where on the design proposal image the degree of attraction based on human sensibilities is high for a certain sensitivity question item, and how the degree of attraction changes across different sensitivity question items. Therefore, when this is used for product marketing, if the sensitivity points to be appealed to the purchasing target group are clear at the stage of preparing multiple design proposals, the extracted relationships can serve as material for judging which of the design proposals should be selected.
  • in other words, an index can be obtained when selecting a design proposal in consideration of the persona of the purchasing target group.
  • (Second Embodiment) FIG. 22 is a schematic view of the design evaluation system 100 according to the second embodiment as viewed from above, and FIGS. 23A and 23B are flowcharts illustrating the design evaluation method of the control device 300 according to the second embodiment.
  • the processing of the flowcharts of FIGS. 23A and 23B is realized by the CPU 310 executing the design evaluation program.
  • FIG. 24 is a schematic diagram illustrating design proposal 1, FIGS. 25A and 25B are schematic views illustrating the area settings of the images of design proposals 1 and 2, respectively, and FIGS. 26A and 26B are schematic diagrams illustrating the saliency maps of the images of design proposals 1 and 2, respectively.
  • FIGS. 27A and 27B are diagrams illustrating the estimated degrees of attraction in areas 1 to 4 of the images of design proposals 1 and 2, and FIG. 28 is a schematic diagram illustrating the case where the images of design proposals 1 and 2 are displayed in the standard state. Further, FIG. 29 is a schematic diagram illustrating the distribution of the line-of-sight positions of a user viewing the images of design proposals 1 and 2, and FIGS. 30 and 31 are schematic diagrams illustrating the measured degrees of attraction and the attractiveness differences, respectively, in areas 1 to 4 of the images of design proposals 1 and 2.
  • FIGS. 32A to 32C are schematic views illustrating the display of the images of design proposals 1 and 2 together with each sensitivity question item, and FIGS. 33A to 33C are schematic diagrams illustrating the distribution of the user's line-of-sight positions when each sensitivity question item is displayed. Further, FIGS. 34A to 34C are schematic views illustrating the measured degree of attraction for each sensitivity question item, and FIG. 35 is a diagram illustrating the calculation results of the reaction intensity when a design proposal is selected for each sensitivity question item. FIG. 36 is a graph illustrating the attractiveness conversion values in areas 1 to 4 of the images of design proposals 1 and 2 for each sensitivity question item, and FIGS. 37A to 37C are graphs illustrating the correlation between the attractiveness difference and the attractiveness conversion value for each sensitivity question item.
  • in the second embodiment, the images 31 and 32 of two evaluation objects and a sensitivity question item (for example, "innovative") 41 relating to the designs of the two evaluation objects are presented to the user 10 simultaneously.
  • the user 10 then selects, according to his or her own sensibility, the design proposal that seems to best match the sensitivity question item from the plurality of design proposals of the evaluation target.
  • an image of the evaluation target is acquired (step S201).
  • in addition to the design proposal (design proposal 2) used in the first embodiment, the design proposal (design proposal 1) shown in FIG. 24 is used.
  • design proposal 1 is a design proposal in which a concrete cooking example of the "ingredients" is shown by an illustration.
  • the image acquisition unit 351 acquires the design proposal 1 image 31 and the design proposal 2 image 32 (hereinafter, also referred to as “design proposal 1, 2 image 31, 32”).
  • the attribute information acquisition unit 352 reads, for example, information about the attributes of the user 10 from the auxiliary storage unit 340.
  • the degree of attraction is estimated for each area of the image of the evaluation target (step S203).
  • the attraction degree estimation unit 353 sets a plurality of areas (for example, areas 1 to 4) for each of the design proposals 1 and 2 images 31 and 32.
  • further, the attraction degree estimation unit 353 generates saliency maps for the design proposal 1, 2 images 31 and 32, and stores them in the RAM 320.
  • FIG. 27A exemplifies the estimated degree of attraction in areas 1 to 4 of the design proposal 1 image 31.
  • FIG. 27B illustrates the estimated degree of attraction in areas 1 to 4 of the design proposal 2 image 32.
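  • The saliency-map step of S203 could look like the following sketch, which uses OpenCV's spectral-residual saliency (available in opencv-contrib-python). The algorithm choice, the file name, and the area rectangles are all assumptions; the patent does not specify how the attraction degree estimation unit 353 computes saliency.

```python
import cv2
import numpy as np

# Bottom-up saliency via spectral residual (one common choice; the patent does
# not specify the algorithm used by the attraction degree estimation unit 353).
img = cv2.imread("design_proposal_1.png")                 # assumed file name
saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
ok, sal_map = saliency.computeSaliency(img)               # float map in [0, 1]

# Estimated attraction per area: mean saliency inside each rectangle
# (assumed (x1, y1, x2, y2) boxes standing in for areas 1 to 4).
areas = {1: (0, 0, 200, 100), 2: (0, 100, 200, 220),
         3: (0, 220, 200, 300), 4: (0, 300, 200, 400)}
estimated = {a: float(np.mean(sal_map[y1:y2, x1:x2]))
             for a, (x1, y1, x2, y2) in areas.items()}
print(estimated)
```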
  • next, the image of the evaluation target is displayed (step S204).
  • the display control unit 356 causes the display 410 of the display device 400 to display the design proposals 1 and 2 images 31 and 32 side by side at predetermined intervals.
  • the information regarding the line of sight of the user 10 with respect to the image of the evaluation object displayed on the display device 400 is acquired (step S205).
  • the line-of-sight measuring device 200 measures the line-of-sight of the user 10 with respect to the design proposals 1 and 2 images 31 and 32 displayed on the display device 400, and transmits information about the line-of-sight to the line-of-sight information acquisition unit 358.
  • as shown in FIG. 29, the distribution of line-of-sight positions can be illustrated by a heat map.
  • in the design proposal 1 image 31, there is a concentration zone C1 in which the line of sight is somewhat strongly concentrated.
  • in the design proposal 2 image 32, there are strong concentration zones C2 to C4 on the characters of area 1 and at the upper and lower parts of the "hourglass" illustration of area 2, respectively.
  • the attraction degree measurement unit 354 calculates the measurement attraction degree for each area of the design proposals 1 and 2 images 31 and 32 based on the information regarding the line of sight.
  • FIG. 30 illustrates the degree of attraction for measurement in areas 1 to 4 of the design proposals 1 and 2 images 31 and 32.
  • the difference calculation unit 355 calculates the difference between the measured attraction degree and the estimated attraction degree (attraction degree difference) for each area of the design proposals 1 and 2 images 31 and 32.
  • FIG. 31 illustrates the difference in attractiveness in areas 1 to 4 of the design proposals 1 and 2 images 31 and 32.
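  • Steps S205 to S207 amount to binning gaze samples by area and subtracting the estimate. A minimal sketch follows; the gaze coordinates, area rectangles, estimated values, and the share-based scaling of the measured attraction degree are all assumptions, since none of them are disclosed.

```python
import numpy as np

# Assumed gaze samples (x, y) from the line-of-sight measuring device 200.
gaze = np.array([[50, 120], [55, 130], [60, 260], [120, 40], [58, 128]])

areas = {1: (0, 0, 200, 100), 2: (0, 100, 200, 220),   # assumed rectangles
         3: (0, 220, 200, 300), 4: (0, 300, 200, 400)}

def hits(box):
    x1, y1, x2, y2 = box
    inside = (gaze[:, 0] >= x1) & (gaze[:, 0] < x2) & \
             (gaze[:, 1] >= y1) & (gaze[:, 1] < y2)
    return int(inside.sum())

# Measured attraction: share of gaze samples per area (one plausible scaling).
measured = {a: hits(box) / len(gaze) for a, box in areas.items()}

estimated = {1: 0.30, 2: 0.25, 3: 0.25, 4: 0.20}        # assumed estimates
diff = {a: round(measured[a] - estimated[a], 2) for a in areas}  # per-area difference
print(measured, diff)
```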
  • a plurality of sensibility question items regarding the design proposal of the evaluation target are provided to the user (step S208).
  • specifically, the control device 300 provides the user 10 with three sensitivity question items 41 to 43, "innovative", "delicious", and "like", with respect to design proposal 1 and design proposal 2. As in the first embodiment, these sensitivity question items are sequentially displayed one by one on the display 410 of the display device 400 at predetermined time intervals (for example, 10 seconds) (see FIGS. 32A to 32C). In the example shown in FIG. 32A, the sensitivity question item "innovative" 41 is displayed at a position between the design proposal 1 image 31 and the design proposal 2 image 32.
  • when the display control unit 356 displays each sensitivity question item on the display 410, it also displays a message asking the user 10 to select a design proposal, such as "Please select the design proposal matching the displayed words from design proposal 1 and design proposal 2." The selection of the design proposal can be set to be made within a predetermined time limit (e.g., 10 seconds).
  • the position of the sensitivity question item is not limited to the intermediate display position.
  • the line-of-sight measuring device 200 measures the line-of-sight of the user 10 with respect to the design proposals 1 and 2 images 31 and 32 displayed on the display device 400, and transmits information about the line-of-sight to the line-of-sight information acquisition unit 358.
  • the user 10 selects a design proposal according to his or her own sensibility for each sensitivity question item. Specifically, the user 10 selects either design proposal 1 or design proposal 2 using the keyboard or mouse of the input device 500, thereby judging the relative merits of these design proposals.
  • further, the input device 500 measures the response time from when each sensitivity question item is displayed until the user 10 selects design proposal 1 or design proposal 2 using the keyboard or the mouse. The line-of-sight information acquisition unit 358 also acquires information on the user's line of sight with respect to the design proposal 1, 2 images 31 and 32 displayed on the display device 400, and measures the movement history of the user's line of sight within the response time of the user 10. The movement history is displayed on the display 410 of the display device 400 as a distribution of line-of-sight positions (see FIGS. 33A to 33C).
  • in the heat map of the design proposal 1 image 31, there is a relatively strong concentration zone C5 of the line of sight in the upper part of the illustration of the cooking example where characters are written, and a weak concentration zone C6 in the lower part where characters are written.
  • in the heat map of the design proposal 2 image 32, there is a weak concentration zone C7 at the upper part of the "hourglass" illustration and a strong concentration zone C8 at the lower part.
  • next, the sensitivity information acquisition unit 359 determines whether or not the selection of a design proposal by the user 10 has been completed for each sensitivity question item (step S210). If the selection has not been completed (step S210: NO), the process returns to step S209.
  • when the selection has been completed (step S210: YES), the degree of attraction is measured for each area (step S211).
  • specifically, the attraction degree measurement unit 354 calculates the measured attraction degree of the design proposal 1, 2 images 31 and 32 based on the information on the line of sight for each sensitivity question item.
  • FIGS. 34A to 34C exemplify the degree of attraction of measurement in areas 1 to 4 for "innovative”, “delicious”, and “like", respectively.
  • then, the sensitivity information acquisition unit 359 calculates the sensitivity information of the user 10 for each sensitivity question item.
  • in the present embodiment, the sensitivity information is calculated for each sensitivity question item so as to be proportional to the reciprocal of the response time (hereinafter also referred to as the "reaction intensity"). This is because the shorter the response time, the stronger the impression the sensitivity question item makes on the sensibility of the user 10; that is, a short response time is considered to indicate a strong connection between the sensitivity question item and the sensibility of the user 10.
  • here, the sensitivity information is set to a value equal to the reaction intensity. As shown in FIG. 35, for example, when the response times for "innovative", "delicious", and "like" were 3.5 seconds, 8.5 seconds, and 6.9 seconds, the reaction intensities are about 2.9, about 1.2, and about 1.4, respectively.
  • the attractiveness conversion value is calculated for each area (step S213).
  • specifically, the attraction degree conversion value calculation unit 360 calculates the attraction degree conversion value for each sensitivity question item based on the measured attraction degree and the sensitivity information in each area.
  • in the present embodiment, the attraction degree conversion value is represented by the product of the measured attraction degree and the sensitivity information in each area.
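  • As a worked check of the numbers above: the stated reaction intensities are reproduced if the proportionality factor is 10, which plausibly corresponds to the 10-second time limit, though that factor is an assumption. The conversion-value step is then a per-area product; the measured values below are placeholders (area 2 is chosen so the result reproduces the 3.77 quoted in the following discussion).

```python
# Reaction intensity: proportional to the reciprocal of the response time.
# A factor of 10 reproduces the values stated above (an assumption, possibly
# tied to the 10-second time limit).
response_times = {"innovative": 3.5, "delicious": 8.5, "like": 6.9}
reaction = {k: 10.0 / t for k, t in response_times.items()}
print({k: round(v, 1) for k, v in reaction.items()})
# -> {'innovative': 2.9, 'delicious': 1.2, 'like': 1.4}, matching FIG. 35

# Attractiveness conversion value: measured attraction x sensitivity
# information in each area (measured values here are assumed placeholders).
measured = {1: 0.10, 2: 1.32, 3: 0.20, 4: 0.15}
conversion = {(item, a): round(r * m, 2)
              for item, r in reaction.items() for a, m in measured.items()}
print(conversion[("innovative", 2)])   # 3.77 with these assumed values
```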
  • then, the sensitivity output unit 362 outputs, for example, a graph showing the correlation between the attractiveness difference and the attractiveness conversion value to the display 410.
  • as a result, the sensibilities of the user 10 for the plurality of sensitivity question items 41 to 43 are visualized with respect to the design proposal 1, 2 images 31 and 32.
  • for example, the sensitivity extraction unit 361 extracts the relationship that, for all of "innovative", "delicious", and "like", the attractiveness difference takes a relatively large positive value of 0.3 or more in area 3 of the design proposal 1 image 31 and in area 2 of the design proposal 2 image 32.
  • the sensitivity extraction unit 361 also extracts the relationship that, in areas other than area 3 of the design proposal 1 image 31 and area 2 of the design proposal 2 image 32, the attractiveness difference is approximately 0 or less for all of "innovative", "delicious", and "like".
  • for example, the attractiveness conversion values of area 2 of the design proposal 2 image 32 are 3.77, 0.12, and 2.23 when the sensitivity question items are "innovative", "delicious", and "like", respectively.
  • that is, the attractiveness conversion value is higher for "innovative" and "like" than for "delicious". In other words, when the sensitivity question items are "innovative" and "like", a large attractiveness difference is accompanied by a large attractiveness conversion value, so the correlation between the attractiveness difference and the attractiveness conversion value is higher than in the case of "delicious".
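  • The correlation described here can be quantified as a Pearson coefficient over the per-area pairs. A small sketch under assumed placeholder values (not the actual figure data):

```python
import numpy as np

# Per-area (attractiveness difference, conversion value) pairs for one
# sensitivity question item; assumed placeholder numbers, not the figure data.
diff       = np.array([-0.05, 0.35, 0.30, -0.10])
conversion = np.array([0.40, 3.77, 2.10, 0.20])

r = np.corrcoef(diff, conversion)[0, 1]   # Pearson correlation coefficient
print(round(r, 2))   # a value near 1 indicates the high correlation described above
```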
  • as described above, the sensitivity extraction unit 361 extracts the relationship between the attractiveness difference and the plurality of sensitivity question items based on the attractiveness difference and the attractiveness conversion value. This makes it possible to determine which sensitivity question item causes the attractiveness difference to become large.
  • the present invention is not limited to such cases; it can also be applied when the images of the evaluation objects have the same design, with one image subjected to decorative printing and the other not.
  • in this case, the image acquisition unit 351 acquires an image of the evaluation object to which decorative printing is applied and an image of the evaluation object to which decorative printing is not applied.
  • the decorative printing is, for example, beat printing, varnish printing, or the like.
  • in this case as well, the relationship between the attractiveness difference and the plurality of sensitivity question items can be extracted by the statistical method or the machine learning method described in the first embodiment.
  • the design evaluation program may be provided on a computer-readable recording medium such as a USB memory, a flexible disk, or a CD-ROM, or may be provided online via a network such as the Internet. In this case, the program recorded on the computer-readable recording medium is usually transferred to and stored in a memory, storage, or the like. The design evaluation program may be provided as standalone application software, or may be incorporated into the software of each device as one function of that device. In addition, part or all of the processing executed by the program may be replaced by hardware such as circuits.
  • 100 design evaluation system, 200 line-of-sight measuring device, 300 control device, 310 CPU, 320 RAM, 330 ROM, 340 auxiliary storage unit, 351 image acquisition unit, 352 attribute information acquisition unit, 353 attraction degree estimation unit, 354 attraction degree measurement unit, 355 difference calculation unit, 356 display control unit, 357 expression providing unit, 358 line-of-sight information acquisition unit, 359 sensitivity information acquisition unit, 360 attraction degree conversion value calculation unit, 361 sensitivity extraction unit, 362 sensitivity output unit, 370 learning device, 371 learning data storage unit, 372 model generation unit, 400 display device, 500 input device, 600 audio output device, 700 communication device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

[Problem] To provide a design evaluating device, a design evaluating system, a design evaluating method, a design evaluating program, and a learning device with which a user's kansei (subjective impression) with respect to an image of an object being evaluated can be visualized. [Solution] This design evaluating device includes: an image acquiring unit; an attraction degree estimating unit for estimating a degree of attraction of an image on the basis of characteristics of the image; a display control unit; an expression providing unit for providing to the user a plurality of expressions relating to the design of the object being evaluated; a line of sight information acquiring unit for acquiring information relating to the line of sight of the user with respect to an image being displayed on the display device; an attraction degree measuring unit for measuring the degree of attraction of the image on the basis of the information relating to the line of sight; a difference calculating unit for calculating the difference between the estimated degree of attraction estimated by the attraction degree estimating unit and the measured degree of attraction measured by the attraction degree measuring unit; a kansei information acquiring unit for acquiring kansei information indicating a relationship between the expressions and the user's kansei relating to the design; and a kansei extracting unit for using the kansei information to extract a correlation between the difference and the plurality of expressions.

Description

Design evaluation device, design evaluation system, design evaluation method, design evaluation program, and learning device

 The present invention relates to a design evaluation device, a design evaluation system, a design evaluation method, a design evaluation program, and a learning device.
 The design of product packages and advertisements has become an important factor in attracting consumer attention in the market. Sellers have traditionally pursued image strategies aimed at attracting consumer attention and leading to product purchases. In general, however, the evaluation of a design depends heavily on the subjectivity of individual consumers, and similar evaluations are not always obtained across individuals.

 Understanding consumers' purchasing decisions, thoughts, and motives is generally said to be difficult. Research suggests that about 90 to 95% of the thought process by which consumers decide their thoughts and actions actually occurs in the subconscious (e.g., Kahneman, 2013; Zaltman, 2003). Against this background, it is important for product marketing practitioners who handle a brand, and for the brand itself, to establish in consumers the subconscious associations the brand seeks, and also to actually understand the degree to which those associations have taken hold.

 In addition, when products are sold or advertised through various media such as magazines, television, and the Internet as well as in stores, the design of product packages and advertisements is a major factor affecting product sales. In recent years, there have been an increasing number of cases in which consumers purchase products through media such as television and the Internet, relying on the designs of product packages and advertisements. As a sales strategy, it is conceivable to create designs that attract the attention of many consumers, but there is a problem in that evaluations vary among individuals in preliminary surveys with subjects.
 In response, Patent Document 1 below discloses a technique in which an image of an evaluation object and items for performing a sensitivity evaluation of the image are presented to a subject, the subject is asked to evaluate the image, and the subject's gaze transitions are measured while the subject performs the evaluation. The purpose of this technique is to calculate a unified sensitivity evaluation for the image by adding the subject's line-of-sight information to the variation in sensitivity evaluations between subjects.

 A technique is also known for calculating the correlation between a subject's line-of-sight position on the image of an evaluation object and the saliency of the evaluation object (saliency indicates how easily something draws human visual attention). For example, Patent Document 2 below discloses a line-of-sight analysis system 1 comprising at least a line-of-sight tracking device 3 that measures a subject's line of sight, a physiological index measuring device 4 that measures the subject's physiological indices, and a line-of-sight analysis device 2 developed to estimate the subject's psychological state from the physiological indices and to analyze the subject's line of sight in association with that psychological state. A region in which the concentration of the subject's line-of-sight positions is at or above a concentration threshold and the degree of saliency is at or above a saliency threshold is a region toward which the line of sight is frequently directed and which is also highly salient, and is therefore regarded as a region of interest that may affect the subject's impression.

 However, when an object in an image is highly salient, the user may direct his or her gaze at the object even without being interested in it, which may affect the estimation of the degree of interest indicating the extent of the user's interest in the object. In this connection, a technique is known for estimating the user's degree of interest in an object such that the longer the gaze dwells on the object, the higher the estimated interest, and the smaller the saliency, the higher the estimated interest (see, for example, Patent Document 3 below).

 Patent Document 1: Japanese Patent No. 3772514; Patent Document 2: Japanese Patent No. 6201520; Patent Document 3: Japanese Patent No. 5602155
 However, the technique of Patent Document 1 aims to reduce the variation in evaluations among individuals with respect to the image of the evaluation object. Therefore, when a seller or the like formulates a sales strategy in consideration of the subjectivity of individual consumers, it is difficult to understand what kind of individual sensibility is at work on the image of the evaluation object.

 In addition, the subject's line-of-sight position within the image of the evaluation object and the saliency of the evaluation object are not necessarily highly correlated. Therefore, even with the technique of Patent Document 2, it is not possible to know what kind of individual sensibility is at work on the image of the evaluation object. The technique of Patent Document 3 aims to reduce the error in estimating the degree of interest, and it likewise does not reveal what kind of individual sensibility is at work on the image of the evaluation object.

 The present invention has been made in view of the above problems. An object of the present invention is therefore to provide a design evaluation device, a design evaluation system, a design evaluation method, a design evaluation program, and a learning device capable of visualizing a user's sensibility regarding an image of an evaluation object.

 The above problems of the present invention are solved by the following means.
 (1) A design evaluation device comprising: an image acquisition unit that acquires at least one image of an evaluation object; an attraction degree estimation unit that estimates the degree of attraction of the image based on characteristics of the image; a display control unit that displays the image on a display device; an expression providing unit that provides a user with a plurality of expressions relating to the design of the evaluation object; a line-of-sight information acquisition unit that acquires information on the user's line of sight with respect to the image displayed on the display device; an attraction degree measurement unit that measures the degree of attraction of the image based on the information on the line of sight; a difference calculation unit that calculates the difference between the estimated attraction degree estimated by the attraction degree estimation unit and the measured attraction degree measured by the attraction degree measurement unit; a sensitivity information acquisition unit that acquires sensitivity information indicating the relationship between the expressions and the user's sensibility regarding the design; and a sensitivity extraction unit that uses the sensitivity information to extract the relationship between the difference and the plurality of expressions.

 (2) The design evaluation device according to (1) above, wherein the difference calculation unit sets a plurality of mutually non-overlapping regions on the image and calculates the difference in each region, and the sensitivity extraction unit extracts the relationship between the difference and the plurality of expressions in each region.

 (3) The design evaluation device according to (2) above, further comprising an attraction degree conversion value calculation unit that calculates an attraction degree conversion value based on the measured attraction degree and the sensitivity information in each region, wherein the sensitivity information acquisition unit receives the user's answers to the expressions regarding the design and acquires the sensitivity information based on the answers, and the sensitivity extraction unit extracts the relationship based on the correlation between the difference and the attraction degree conversion value.

 (4) The design evaluation device according to (3) above, further comprising an attribute information acquisition unit that acquires information on the attributes of the user, wherein the sensitivity extraction unit extracts, in each region, the relationship between the difference and the plurality of expressions and the information on the attributes of the user.

 (5) The design evaluation device according to (2) above, wherein the sensitivity extraction unit extracts the relationship by a statistical method using the plurality of expressions as explanatory variables and the difference as an objective variable.
 (6) The design evaluation device according to (2) above, further comprising an attribute information acquisition unit that acquires information on the attributes of the user, wherein the sensitivity extraction unit extracts the relationship by a statistical method using the plurality of expressions and the information on the attributes of the user as explanatory variables and the difference as an objective variable.
 (7) The design evaluation device according to (2) above, further comprising an attribute information acquisition unit that acquires information on the attributes of the user, wherein the sensitivity extraction unit extracts the relationship using a machine learning model trained with the sensitivity information and the information on the attributes of the user as explanatory variables and the difference as an objective variable.

 (8) The design evaluation device according to any one of (1) to (7) above, wherein the sensitivity extraction unit extracts the relationship for a plurality of the users.

 (9) The design evaluation device according to any one of (1) to (8) above, wherein the sensitivity information acquisition unit acquires the user's evaluation regarding the design and calculates the sensitivity information based on the user's evaluation.

 (10) The design evaluation device according to any one of (1) to (8) above, wherein the sensitivity information acquisition unit acquires the answer time taken until the user completes an answer to an expression, and calculates the sensitivity information based on the answer time.

 (11) The design evaluation device according to any one of (1) to (10) above, wherein the line-of-sight information acquisition unit continuously acquires the information on the line of sight from when the expression is provided to the user by the expression providing unit until the user's answer to the expression is received by the sensitivity information acquisition unit.

 (12) The design evaluation device according to any one of (1) to (11) above, wherein the display control unit displays the image and the expression on the display device simultaneously.

 (13) The design evaluation device according to any one of (1) to (12) above, wherein the expression includes at least words.

 (14) The design evaluation device according to any one of (1) to (13) above, wherein the image acquisition unit acquires a plurality of images of the evaluation object, the attraction degree estimation unit estimates the degree of attraction of each image based on the characteristics of each image, the display control unit displays each image on the display device, and the attraction degree measurement unit measures the degree of attraction of each image based on the information on the line of sight.

 (15) The design evaluation device according to (14) above, wherein the image acquisition unit acquires an image of the evaluation object to which decorative printing is applied and an image of the evaluation object to which the decorative printing is not applied.

 (16) A design evaluation system comprising: the display device; a line-of-sight measuring device that measures the user's line of sight with respect to the image displayed on the display device and outputs information on the line of sight to the line-of-sight information acquisition unit; and the design evaluation device according to any one of (1) to (15) above.
 (17) A design evaluation method comprising: a step (a) of acquiring at least one image of an evaluation object; a step (b) of estimating the degree of attraction of each image based on characteristics of the image; a step (c) of displaying the image on a display device; a step (d) of providing a user with a plurality of expressions relating to the design of the evaluation object; a step (e) of acquiring information on the user's line of sight with respect to the image displayed on the display device; a step (f) of measuring the degree of attraction of the image based on the information on the line of sight; a step (g) of calculating the difference between the estimated attraction degree estimated in step (b) and the measured attraction degree measured in step (f); a step (h) of acquiring sensitivity information indicating the relationship between the expressions and the user's sensibility regarding the design; and a step (i) of using the sensitivity information to extract the relationship between the difference and the plurality of expressions.

 (18) The design evaluation method according to (17) above, wherein in step (g), a plurality of mutually non-overlapping regions are set on the image and the difference is calculated in each region, and in step (i), the relationship between the difference and the plurality of expressions is extracted in each region.

 (19) The design evaluation method according to (17) or (18) above, further comprising, before step (i), a step (j) of acquiring information on the attributes of the user, wherein in step (i), the relationship is extracted using the sensitivity information and the information on the attributes of the user as explanatory variables and the difference as an objective variable.

 (20) A design evaluation program for causing a computer to execute the processing included in the design evaluation method according to any one of (17) to (19) above.

 (21) A learning device comprising: a storage unit that stores learning data including sensitivity information as an explanatory variable and an attractiveness difference as an objective variable; and a model generation unit that machine-learns the relationship between the sensitivity information and the attractiveness difference based on the learning data stored in the storage unit to generate a machine learning model.
 According to the present invention, the degree of attraction based on the user's sensibility is calculated for the image of the evaluation object, and the relationship between that degree of attraction and a plurality of expressions relating to the image of the evaluation object is extracted, so the user's sensibility regarding the image of the evaluation object can be visualized.
FIG. 1 is a schematic view of the design evaluation system according to the first embodiment as viewed from above.
FIG. 2 is a block diagram illustrating a schematic hardware configuration of the design evaluation system shown in FIG. 1.
FIG. 3 is a functional block diagram illustrating the main functions of the control device shown in FIG. 2.
FIG. 4 is a diagram illustrating sensitivity question items regarding a food package design proposal.
FIG. 5A is a flowchart illustrating the design evaluation method of the control device according to the first embodiment.
FIG. 5B is a flowchart following FIG. 5A.
FIG. 6 is a schematic diagram illustrating a design proposal of an evaluation object.
FIG. 7 is a schematic diagram illustrating the plurality of areas 1 to 4 set on the image of the design proposal.
FIG. 8 is a diagram illustrating the saliency map of the image of the design proposal.
FIG. 9 is a diagram illustrating the estimated degrees of attraction in areas 1 to 4 of the image of the design proposal.
FIG. 10 is a schematic diagram illustrating the case where the image of the design proposal is displayed in the standard state.
FIG. 11 is a schematic diagram illustrating the distribution of the line-of-sight positions of a user viewing the image of the design proposal.
FIG. 12 is a diagram illustrating the measured degrees of attraction in areas 1 to 4 of the image of the design proposal.
FIG. 13 is a diagram illustrating the attractiveness differences in areas 1 to 4 of the image of the design proposal.
FIGS. 14A to 14C are schematic diagrams illustrating the display of the image of the design proposal together with the sensitivity question items ("innovative", "delicious", and "like", respectively).
FIG. 15 is a diagram illustrating, for each sensitivity question item, the measured degrees of attraction in areas 1 to 4 of the image of the design proposal.
FIG. 16 is a diagram illustrating, for each sensitivity question item, the scores input by the user regarding the design proposal.
FIG. 17 is a diagram illustrating, for each sensitivity question item, the attractiveness conversion values in areas 1 to 4 of the image of the design proposal.
FIGS. 18A to 18C are graphs illustrating the correlation between the attractiveness difference and the attractiveness conversion value for the sensitivity question items "innovative", "delicious", and "like", respectively.
FIG. 19 is a diagram illustrating the variables used when the relationship between the attractiveness difference and the plurality of sensitivity question items is extracted using multiple regression analysis.
FIG. 20 is a diagram illustrating the regression coefficients calculated by the multiple regression analysis of FIG. 19.
FIG. 21 is a schematic block diagram illustrating a learning device that generates a machine learning model based on the attractiveness difference and the sensitivity information (and the attributes of the user).
FIG. 22 is a schematic view of the design evaluation system according to the second embodiment as viewed from above.
FIG. 23A is a flowchart illustrating the design evaluation method of the control device according to the second embodiment.
FIG. 23B is a flowchart following FIG. 23A.
FIG. 24 is a schematic diagram illustrating design proposal 1 of the evaluation object.
FIGS. 25A and 25B are schematic diagrams illustrating the area settings of the images of design proposals 1 and 2, respectively.
FIGS. 26A and 26B are schematic diagrams illustrating the saliency maps of the images of design proposals 1 and 2, respectively.
FIGS. 27A and 27B are diagrams illustrating the estimated degrees of attraction in areas 1 to 4 of the images of design proposals 1 and 2, respectively.
FIG. 28 is a schematic diagram illustrating the case where the images of design proposals 1 and 2 are displayed in the standard state.
FIG. 29 is a schematic diagram illustrating the distribution of the line-of-sight positions of a user viewing the images of design proposals 1 and 2.
FIG. 30 is a schematic diagram illustrating the measured degrees of attraction in areas 1 to 4 of the images of design proposals 1 and 2.
FIG. 31 is a schematic diagram illustrating the attractiveness differences in areas 1 to 4 of the images of design proposals 1 and 2.
FIGS. 32A to 32C are schematic diagrams illustrating the display of the images of design proposals 1 and 2 together with the sensitivity question items ("innovative", "delicious", and "like", respectively).
FIGS. 33A to 33C are schematic diagrams illustrating the distribution of the user's line-of-sight positions when the sensitivity question items ("innovative", "delicious", and "like", respectively) are displayed.
FIGS. 34A to 34C are schematic diagrams illustrating the measured degrees of attraction for the sensitivity question items ("innovative", "delicious", and "like", respectively).
FIG. 35 is a diagram illustrating, for each sensitivity question item, the calculation results of the reaction intensity when a design proposal is selected.
FIG. 36 is a graph illustrating, for each sensitivity question item, the attractiveness conversion values in areas 1 to 4 of the images of design proposals 1 and 2.
FIGS. 37A to 37C are graphs illustrating the correlation between the attractiveness difference and the attractiveness conversion value for the sensitivity question items "innovative", "delicious", and "like", respectively.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the drawings, the same elements are given the same reference numerals, and duplicate descriptions are omitted. The dimensional ratios in the drawings are exaggerated for convenience of explanation and may differ from the actual ratios.
 (First Embodiment)
 FIG. 1 is a schematic view of the design evaluation system 100 according to the first embodiment as viewed from above, and FIG. 2 is a block diagram illustrating a schematic hardware configuration of the design evaluation system 100 shown in FIG. 1. FIG. 3 is a functional block diagram illustrating the main functions of the control device 300 shown in FIG. 2, and FIG. 4 is a diagram illustrating sensitivity question items regarding a food package design proposal.
 The design evaluation system 100 of the present embodiment presents an image 30 of an evaluation object and words 40 relating to the design of the evaluation object, and extracts, for the image 30, the relationship between the attractiveness (ease of drawing visual attention) based on the sensibility of the user 10 and the plurality of words 40. In this way, among these words 40, the words 40 that harmonize with the sensibility of the user 10 with respect to the design of the evaluation object are determined. The evaluation object may be, for example, a product designed by a designer, a product package, an advertisement, or the like.
 As shown in FIGS. 1 and 2, the design evaluation system 100 includes a line-of-sight measuring device 200, a control device 300, a display device 400, an input device 500, an audio output device 600, and a communication device 700. These components are communicably connected to each other.
<Line-of-sight measuring device 200>
 The line-of-sight measuring device 200 continuously measures the line of sight 20 of the user 10 and transmits information on the line of sight to the control device 300. As described later, the control device 300 functions as a design evaluation device, displays the image 30 of the evaluation object on the display 410 of the display device 400, and presents it to the user 10. The information on the line of sight may be, for example, the position on the image 30 of the evaluation object that the user 10 is viewing (hereinafter, the "line-of-sight position") 50, the line-of-sight directions of the left and right eyes of the user 10, or the positions of the pupils of the left and right eyes of the user 10.
 視線測定装置200は、例えば、ユーザー10の頭部に装着されうる。視線測定装置200による視線測定には、公知のアイトラッキング技術を採用できる。アイトラッキング技術は、角膜反射法、強膜反射法等の非接触型の技術でありうる。また、サーチコイル法、眼球電位法等の接触型の技術や、これら以外の視線測定法を使用してもよい。 The line-of-sight measuring device 200 can be attached to the head of the user 10, for example. A known eye tracking technique can be adopted for the line-of-sight measurement by the line-of-sight measuring device 200. The eye tracking technique may be a non-contact technique such as a corneal reflex method or a scleral reflex method. Further, a contact-type technique such as a search coil method or an eye potential method, or a line-of-sight measurement method other than these may be used.
When the line of sight is measured using the corneal reflection method, the line-of-sight measuring device 200, for example, irradiates each of the left and right eyes of the user 10 with weak near-infrared light and images the left and right eyes of the user 10 with a camera. From the camera images, the line-of-sight measuring device 200 then calculates, for each of the left and right eyes of the user 10, the center position of the near-infrared light reflected on the corneal surface and the center position of the pupil, and detects the line of sight 20 of the user 10 on the basis of these. As a result, the line of sight 20 of the user 10 is accurately measured while the user 10 is gazing at the image of the evaluation object. The line-of-sight measuring device 200 calculates the line-of-sight position of the user 10 on the image 30 of the evaluation object on the basis of information such as the line-of-sight directions of the user 10 in three-dimensional space and the center positions of the pupils of the left and right eyes of the user 10. Alternatively, the system may be configured such that the line-of-sight directions of the left and right eyes of the user 10, the center positions of the pupils of the left and right eyes of the user 10, and the like are transmitted as the information about the line of sight from the line-of-sight measuring device 200 to the control device 300, and the control device 300 calculates the line-of-sight position 50 on the basis of this information.
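The disclosure does not specify how the pupil and corneal-reflection positions are mapped to an on-screen line-of-sight position; one common approach is a polynomial mapping fitted during calibration. The following is a minimal sketch under that assumption; the function names, the second-order polynomial form, and the toy calibration data are illustrative, not part of the disclosed device.

import numpy as np

def fit_gaze_mapping(pupil_glint_vecs, screen_points):
    """Fit a 2nd-order polynomial mapping from pupil-to-corneal-reflection
    vectors (x, y) to screen coordinates, using calibration samples."""
    x, y = pupil_glint_vecs[:, 0], pupil_glint_vecs[:, 1]
    # Design matrix of polynomial terms: 1, x, y, x*y, x^2, y^2
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    # One least-squares solve covering both screen axes
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs  # shape (6, 2)

def gaze_position(coeffs, pupil_glint_vec):
    """Map a single pupil-glint vector to an (x, y) screen position."""
    x, y = pupil_glint_vec
    terms = np.array([1.0, x, y, x * y, x**2, y**2])
    return terms @ coeffs

# Example: a 9-point calibration grid, then one measurement
rng = np.random.default_rng(0)
calib_screen = np.array([[sx, sy] for sx in (100, 960, 1820)
                                  for sy in (100, 540, 980)], float)
calib_vecs = calib_screen / 2000.0 + rng.normal(0, 1e-3, (9, 2))  # toy data
coeffs = fit_gaze_mapping(calib_vecs, calib_screen)
print(gaze_position(coeffs, calib_vecs[4]))  # approx. screen center (960, 540)

In practice the calibration described in step S105 below would supply the known screen points and the corresponding measured eye features.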
<Control device 300>
The control device 300 functions as a design evaluation device and controls the design evaluation system 100 in an integrated manner. The control device 300 may be, for example, a personal computer, a smartphone, a PDA (Personal Digital Assistant), or the main body (control unit) of a tablet terminal in which a design evaluation program described later is installed. As shown in FIG. 2, the control device 300 includes a CPU (Central Processing Unit) 310, a RAM (Random Access Memory) 320, a ROM (Read Only Memory) 330, an auxiliary storage unit 340, and the like.
The CPU 310 executes an OS (Operating System) and the design evaluation program loaded into the RAM 320, and controls the operation of the line-of-sight measuring device 200, the display device 400, the input device 500, the audio output device 600, and the communication device 700. The design evaluation program is stored in advance in the ROM 330 or the auxiliary storage unit 340. The RAM 320 stores data and the like temporarily generated by the processing of the CPU 310. The ROM 330 stores programs executed by the CPU 310, as well as data, parameters, and the like used for executing those programs.
The auxiliary storage unit 340 includes, for example, an HDD (Hard Disk Drive) or an SSD (Solid State Drive), and can store a plurality of images of evaluation objects. For these images, the evaluation object is imaged by a camera, a scanner, or the like and saved in an image format such as the JPEG (Joint Photographic Experts Group) format, the TIFF (Tag Image File Format) format, or the PNG (Portable Network Graphics) format. The auxiliary storage unit 340 also stores a plurality of words to be presented to the user 10. Details of these words will be described later. In addition, the auxiliary storage unit 340 stores information about the attributes of the user 10 used when performing a monitor test described later.
<Display device 400>
The display device 400 includes a display 410 and is used to present the image of the evaluation object and words relating to the design proposal of the evaluation object to the user 10. In addition, after the user 10 has evaluated the design proposal of the evaluation object, the display 410 shows graphs of the correlation between the attractiveness difference and the attractiveness conversion value described later (for example, FIGS. 18A to 18C), the extraction results of the relationship between the attractiveness difference and a plurality of sensitivity question items (for example, FIG. 20), and the like.
<Input device 500>
The input device 500 includes, for example, a keyboard and a mouse, and is used by the user 10 to enter characters and to give various instructions (inputs) such as various settings. In the present embodiment, it is used for inputting information about the attributes of the user 10 in the monitor test (described later), for input when the user 10 evaluates the design proposal of the evaluation object, and the like. The input device 500 also has a timekeeping function and measures the time taken from when the image of the evaluation object and the words relating to its design proposal are displayed on the display of the display device 400 until the user 10 completes the evaluation of the design proposal (hereinafter referred to as the "response time").
<Audio output device 600>
The audio output device 600 has a speaker and can provide the user 10 with words relating to the design proposal of the evaluation object by voice.
<Communication device 700>
The communication device 700 has a communication circuit such as a network interface card (NIC) and transmits and receives data to and from external devices through a communication network (not shown).
As shown in FIG. 3, when the CPU 310 executes the design evaluation program, the control device 300 functions as an image acquisition unit 351, an attribute information acquisition unit 352, an attractiveness estimation unit 353, an attractiveness measurement unit 354, a difference calculation unit 355, a display control unit 356, an expression providing unit 357, a line-of-sight information acquisition unit 358, a sensitivity information acquisition unit 359, an attractiveness conversion value calculation unit 360, a sensitivity extraction unit 361, and a sensitivity output unit 362.
The image acquisition unit 351 acquires at least one image of the evaluation object stored in, for example, the auxiliary storage unit 340. The image acquisition unit 351 can also acquire an image of the evaluation object from an external server or the like via the communication device 700.
The attribute information acquisition unit 352 acquires the attributes, persona, and the like of the user 10 (hereinafter referred to as "information about the attributes of the user 10"). The attributes of the user 10 can be classified into, for example, three categories: demographic attributes, psychological attributes, and behavioral attributes.
Demographic attributes include, for example, age (age group), gender, and location (place of residence, place of work). Psychological attributes include, for example, values, propensity to consume, and the like. Behavioral attributes include, for example, product purchase history, range of activity, and the like.
For example, the attribute information acquisition unit 352 causes the display device 400 to display a questionnaire containing questions and prompts the user 10 to answer it. The questionnaire may include, for example, questions on the age (age group) and gender of the user 10 as demographic attributes and on the consumption behavior of the user 10 as psychological attributes. Questions about consumption behavior include, for example, whether the user often buys what other people buy, or often buys branded goods.
Behavioral attributes can also be acquired by asking the user 10 in a questionnaire what he or she has purchased recently and which purchasing methods he or she often uses, or by obtaining the web browsing history of the user 10.
Furthermore, attributes can also be acquired by analyzing the behavior (actions) of the user 10 on the basis of image information or the like collected in advance. For example, it is possible to estimate the age of the user 10 from an image of his or her face, or to judge his or her personality from the behavior or eye movements of the user 10. For example, if a camera in a store captures the behavior of the user 10 in front of a display shelf and the user 10 spends a long time reading the description on a product package before purchasing, the user can be judged to think about things carefully or logically. In addition, the accuracy of attribute determination (acquisition) can be improved by combining a questionnaire with an analysis of the behavior of the user 10.
A persona is a fictitious image of a consumer who purchases products and services, described in terms of, for example, occupation, job title, annual income, hobbies, values, lifestyle, and the like. The information about the attributes of the user 10 is stored in, for example, the auxiliary storage unit 340.
The attractiveness estimation unit 353 estimates (simulates) the degree of saliency of the image of the evaluation object. In the present embodiment, the ease with which an image attracts human visual attention owing to its image features is called "saliency", and a numerical value representing the degree of saliency is called the "saliency level". For example, a portion to which the human retina reacts strongly when a person looks at the evaluation object has a high saliency level. The attractiveness estimation unit 353 generates a saliency map showing the distribution of saliency levels over the entire image of the evaluation object. When the saliency map is displayed in color, portions with high saliency are displayed in warm colors such as red, and portions with low saliency are displayed in cool colors. Blue corresponds to the ground level, where the saliency is at its minimum.
In the present embodiment, the saliency map of the image of the evaluation object is generated on the basis of a known technique (for example, the technique described in L. Itti, C. Koch and E. Niebur, "A model of saliency-based visual attention for rapid scene analysis", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20 (1998)). A method of generating a saliency map is also described in, for example, JP 2019-119164 A; since it is a known technique, a detailed description is omitted, but its outline is as follows.
The attractiveness estimation unit 353 creates a first image group by sampling the image of the evaluation object every other pixel a plurality of times and enlarging each of the resulting images to the same size as the image of the evaluation object, and creates a second image group by extracting the edge features of each image included in the first image group. The attractiveness estimation unit 353 also creates a third image group by extracting the luminance features of each image included in the first image group, and a fourth image group by extracting the R (red), G (green), B (blue), and Y (yellow) hue features of each image included in the first image group. The attractiveness estimation unit 353 then calculates a saliency level for each unit region on the basis of at least one of the second, third, and fourth image groups. Further, the attractiveness estimation unit 353 creates a fifth image group by extracting the differences between the R (red) and G (green) images included in the fourth image group, and a sixth image group by extracting the differences between the B (blue) and Y (yellow) images included in the fourth image group. The attractiveness estimation unit 353 then creates the saliency map by normalizing and summing the second, third, fifth, and sixth image groups.
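To make the outline above concrete, the following is a heavily simplified sketch of such a multi-scale saliency map: per-scale edge, luminance, R-G, and B-Y feature maps, normalized and summed. It follows the spirit of the description, not the exact implementation of the cited references; the scale steps, the yellow-channel formula, and the normalization are illustrative assumptions.

import numpy as np
from scipy import ndimage

def saliency_map(img):
    """Simplified saliency map. img: float array (H, W, 3) in [0, 1]."""
    h, w, _ = img.shape
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    lum = (r + g + b) / 3.0                      # luminance channel
    y = np.clip((r + g) / 2.0 - np.abs(r - g) / 2.0 - b, 0, None)  # "yellow"

    feature_maps = []
    for step in (2, 4, 8):                       # coarser samplings
        def at_scale(ch):
            # Sample every step-th pixel, then enlarge back (first group)
            small = ch[::step, ::step]
            return ndimage.zoom(small, (h / small.shape[0], w / small.shape[1]), order=1)
        l_s = at_scale(lum)
        edges = np.hypot(ndimage.sobel(l_s, 0), ndimage.sobel(l_s, 1))  # edges (second group)
        rg = np.abs(at_scale(r) - at_scale(g))   # R-G difference (fifth group)
        by = np.abs(at_scale(b) - at_scale(y))   # B-Y difference (sixth group)
        feature_maps += [edges, l_s, rg, by]

    def normalize(m):                            # rescale each map to [0, 1]
        span = m.max() - m.min()
        return (m - m.min()) / span if span > 0 else np.zeros_like(m)

    # Normalize and sum the feature maps into one saliency map
    return normalize(sum(normalize(m) for m in feature_maps))

sal = saliency_map(np.random.rand(240, 320, 3))  # toy input
print(sal.shape, float(sal.max()))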
Furthermore, for each of a plurality of regions (hereinafter also referred to as "areas") set in the image of the evaluation object, the attractiveness estimation unit 353 calculates an estimated attractiveness on the basis of the saliency levels. The method of calculating the estimated attractiveness will be described later.
The attractiveness measurement unit 354 measures the attractiveness of the image of the evaluation object. On the image of the evaluation object, the line-of-sight position of the user concentrates on portions that stand out owing to features of the image, such as the color or shape of objects, that is, portions with high saliency, and on portions in which the user is interested. The attractiveness of these portions (or regions) is therefore considered to increase. The attractiveness measurement unit 354 calculates the attractiveness in each area (referred to as the "measured attractiveness") on the basis of the information about the line of sight, that is, the result of the line-of-sight measurement, and stores it in the RAM 320. The details of the method of calculating the measured attractiveness will be described later.
The difference calculation unit 355 calculates the difference between the measured attractiveness calculated by the attractiveness measurement unit 354 and the estimated attractiveness estimated by the attractiveness estimation unit 353. More specifically, the difference calculation unit 355 calculates, for each area, the difference between the measured attractiveness and the estimated attractiveness (hereinafter also referred to as the "attractiveness difference") and stores it in the RAM 320.
The display control unit 356 controls the display of the image of the evaluation object on the display 410 of the display device 400. The display control unit 356 also controls the display of the words provided by the expression providing unit 357. For example, the display control unit 356 causes the display 410 to display the image of the evaluation object and the words at the same time.
The expression providing unit 357 provides the user 10 with words relating to the design proposal of the evaluation object. The words can be selected as appropriate according to the type, field, content, and the like of the design proposal of the evaluation object, in view of the emotional points that the designer of the evaluation object or the product manufacturer wants to convey to consumers. These words can therefore correspond to the sensibilities of the user 10 with respect to the design of the evaluation object. For example, when the evaluation object is a food package, words such as "delicious", which relates to taste, and "like", which relates to the preferences of the user 10, can be selected. The words may also be set in advance by the person conducting the monitor test or the like when a monitor test with subjects is performed.
As shown in FIG. 4, the expression providing unit 357 can provide the user 10 with a plurality of words relating to the design proposal of the evaluation object. These words are set as "sensitivity question items" in the monitor test. FIG. 4 exemplifies three words relating to the design proposal of a food package: "innovative", "delicious", and "like".
The words presented to the user 10 may be words, clauses, or sentences including questions, adjectives, features of the evaluation object, and the like. Instead of, or in addition to, displaying the words on the display 410 of the display device 400, words and sounds may be provided to the user 10 audibly through the speaker of the audio output device 600. Furthermore, instead of, or in addition to, visual or auditory information such as words, at least one of tactile, olfactory, and gustatory information may be provided to the user 10. Hereinafter, such visual, auditory, tactile, olfactory, or gustatory information is referred to in this specification as an "expression". An expression can serve as a sensitivity question item. The following description mainly concerns the case where the expression is a word, but the case where the expression is something other than a word, such as a smell or a taste, is handled in the same way.
The line-of-sight information acquisition unit 358 acquires information about the line of sight of the user 10 with respect to the image of the evaluation object displayed on the display device 400. The line-of-sight information acquisition unit 358 acquires from the line-of-sight measuring device 200, as the information about the line of sight, for example the line-of-sight position of the user 10 on the image of the evaluation object.
The sensitivity information acquisition unit 359 calculates the sensitivity information of the user 10 regarding the design of the evaluation object. In the present embodiment, the design proposal is evaluated, for example, by the user 10 giving a score to the design proposal for each sensitivity question item. The user 10 inputs the score for the design proposal using the keyboard or mouse of the input device 500. The sensitivity information acquisition unit 359 receives the evaluation of the user 10 for each sensitivity question item, calculates, on the basis of the evaluation, sensitivity information indicating the relationship between each sensitivity question item and the sensibilities of the user 10 regarding the design proposal, and stores it in the RAM 320.
The line-of-sight information acquisition unit 358 continuously acquires information about the line of sight from when a word is provided by the expression providing unit 357 and displayed on the display 410 of the display device 400 until the evaluation of the user 10 for that word is completed. That is, the presentation of a word by the expression providing unit 357, the acceptance of the evaluation of the user 10 by the sensitivity information acquisition unit 359, and the measurement of the line of sight 20 by the line-of-sight measuring device 200 can be executed concurrently.
The attractiveness conversion value calculation unit 360 calculates an attractiveness conversion value on the basis of the measured attractiveness and the sensitivity information in each area and stores it in the RAM 320. The details of the method of calculating the attractiveness conversion value will be described later.
The sensitivity extraction unit 361 extracts the relationship between the attractiveness difference and the plurality of sensitivity question items (plurality of expressions) 41 to 43. The sensitivity extraction unit 361 extracts the relationship between the attractiveness difference and the plurality of words on the basis of, for example, the correlation between the attractiveness difference and the attractiveness conversion value. The sensitivity extraction unit 361 can also extract this relationship by a statistical method or a machine-learning method described later.
The sensitivity output unit 362 outputs, to the display 410, graphs of the correlation between the attractiveness difference and the attractiveness conversion value, the extraction results of the relationship between the attractiveness difference and the plurality of sensitivity question items 41 to 43, and the like.
<Design evaluation method>
FIGS. 5A and 5B are flowcharts illustrating the design evaluation method of the control device 300 according to the first embodiment, and FIG. 6 is a schematic diagram illustrating a design proposal for the evaluation object. The processing in the flowcharts of FIGS. 5A and 5B is realized by the CPU 310 executing the design evaluation program. FIG. 7 is a schematic diagram illustrating a plurality of areas 1 to 4 set for the image of the design proposal, FIG. 8 is a diagram illustrating a saliency map of the image of the design proposal, and FIG. 9 is a diagram illustrating the estimated attractiveness in areas 1 to 4 of the design proposal. FIG. 10 is a schematic diagram illustrating the image of the design proposal displayed in the standard state, and FIG. 11 is a schematic diagram illustrating the distribution of the line-of-sight positions of the user 10 viewing the image of the design proposal. FIGS. 12 and 13 are diagrams illustrating, respectively, the measured attractiveness and the attractiveness difference in areas 1 to 4 of the image of the design proposal, and FIGS. 14A to 14C are schematic diagrams illustrating the display of the image of the design proposal and of each sensitivity question item. FIG. 15 is a diagram illustrating, for each sensitivity question item, the measured attractiveness in areas 1 to 4 of the image of the design proposal, FIG. 16 is a diagram illustrating, for each sensitivity question item, the score entered by the user 10 for the design proposal, and FIG. 17 is a diagram illustrating, for each sensitivity question item, the attractiveness conversion value in areas 1 to 4 of the image of the design proposal. FIGS. 18A to 18C are graphs illustrating, for each sensitivity question item, the correlation between the attractiveness difference and the attractiveness conversion value.
In the following, the design evaluation method of the control device 300 will be described taking as an example the case of evaluating the package design of a certain food. In the present embodiment, it is assumed that after the design proposal for the package is completed, a monitor test is conducted in which the user 10 serves as a subject and evaluates the design proposal.
In the monitor test, the design evaluation system 100 displays the image of the design proposal and words relating to the design on the display 410, and extracts, for the image of the design proposal, the relationship between the plurality of words and the attractiveness based on the sensibilities of the user 10. More specifically, the procedure is as follows.
First, an image of the evaluation object is acquired (step S101). The image acquisition unit 351 acquires, as the image of the evaluation object, for example an image of a certain food package from among the plurality of images stored in the auxiliary storage unit 340 (see FIG. 6). Hereinafter, the image of the design proposal for this evaluation object is referred to as the "design proposal image". The design proposal in FIG. 6 is, for example, a novel design combining an "hourglass" with "food ingredients".
Next, information about the attributes of the user 10 is acquired (step S102). The attribute information acquisition unit 352 reads the information about the attributes of the user 10 from, for example, the auxiliary storage unit 340. In the present embodiment, it is assumed, for example, that the user 10 is a woman in her twenties who works for a company in the city center. In this case, the information about the attributes of the user 10 includes information such as gender: female, age: twenties, and location: Tokyo.
Next, the attractiveness is estimated for each area of the image of the evaluation object (step S103). As shown in FIG. 7, the attractiveness estimation unit 353 identifies objects (characters, figures, and the like) in the design proposal image 30 using, for example, a known image recognition technique and, on the basis of the identification result, sets a plurality of non-overlapping areas (for example, areas 1 to 4) in the design proposal image 30. Area 1 is an area including the character string "塩糀の砂時計" ("hourglass of salt koji") at the right end of the design proposal image 30, and area 2 is an area including the illustration of an "hourglass" in the center of the design proposal image 30. Area 3 is an area including the character string "〇〇〇〇〇〇/CORPORATION" at the bottom of the design proposal image 30, and area 4 is an area including the character string "牡蠣と檸檬のマリアージュ" ("mariage of oysters and lemons") at the left end of the design proposal image 30.
Subsequently, as shown in FIG. 8, the attractiveness estimation unit 353 generates a saliency map for the design proposal image 30. In the example in the figure, there are portions with high saliency on the characters "糀" (A1) and "計" (A2) of "塩糀の砂時計" in area 1, in the central portion (A3) of the "hourglass" illustration in area 2, and on the characters (A4 to A6) in area 3. There are portions with medium saliency on the outline (A7) of the "hourglass" illustration in area 2 and on the characters (A8) of "牡蠣と檸檬のマリアージュ" in area 4. In the case of a color display, the portions A1 to A6 with high saliency may be displayed in colors such as red and yellow, and the portions A7 and A8 with medium saliency may be displayed in colors such as green.
Portions with high saliency in the design proposal image 30 attract the attention of the user 10 and are therefore considered highly attractive. The attractiveness estimation unit 353 calculates the estimated attractiveness in each area. In the present embodiment, the estimated attractiveness in each area represents the degree to which the features of the design proposal image 30 tend to draw the visual attention of the user 10, and can be defined, for example, as the ratio of the saliency level in each area to the sum of the saliency levels over all areas. Therefore, the higher the saliency level, the higher the estimated attractiveness in an area, and the lower the saliency level, the lower the estimated attractiveness. The estimated attractiveness summed over all areas is 1.0.
FIG. 9 illustrates the estimated attractiveness in areas 1 to 4. In the example in the figure, the estimated attractiveness in area 2 is 0.52, higher than the estimated attractiveness in areas 1, 3, and 4. Area 2 is therefore considered more likely to attract the attention of the user 10 than areas 1, 3, and 4. That is, owing to the saliency of area 2, the design proposal image 30 inherently has an attractiveness exceeding 50% in area 2. For this reason, area 2 of the design proposal image 30 is considered to attract the attention not only of the user 10 but also of other users.
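Under the definition above, the estimated attractiveness per area is simply each area's share of the total saliency. The following is a minimal sketch; the area layout and the toy saliency values are illustrative assumptions.

import numpy as np

def estimated_attractiveness(sal_map, area_masks):
    """Estimated attractiveness per area: each area's total saliency
    divided by the total saliency over all areas (sums to 1.0)."""
    totals = np.array([sal_map[mask].sum() for mask in area_masks])
    return totals / totals.sum()

# Toy example: a 4x4 saliency map split into left/right halves
sal = np.arange(16, dtype=float).reshape(4, 4)
left = np.zeros((4, 4), bool)
left[:, :2] = True
masks = [left, ~left]
print(estimated_attractiveness(sal, masks))  # -> [0.433..., 0.566...]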
Next, the image of the evaluation object is displayed (step S104). As shown in FIG. 10, the display control unit 356 performs control so that the design proposal image 30 is displayed on the display 410 of the display device 400. Hereinafter, in the present embodiment, displaying only the design proposal image 30, without any sensitivity question item, is referred to as "display in the standard state". The display in the standard state is performed for a predetermined time (for example, 10 seconds).
Next, information about the line of sight of the user 10 with respect to the image of the evaluation object displayed on the display device 400 is acquired (step S105). The line-of-sight measuring device 200 measures the line of sight of the user 10 with respect to the design proposal image 30 displayed on the display device 400 and transmits the information about the line of sight to the line-of-sight information acquisition unit 358. Before the start of the monitor test, the line-of-sight measuring device 200 is calibrated to the eyes of the user 10. In the calibration, the geometric features of the eyes of the user 10 are acquired in order to accurately calculate the line-of-sight position of the user 10.
The line-of-sight measuring device 200 continuously measures the line of sight of the user 10 over the predetermined time, and during this period the display control unit 356 causes the display 410 to display the locus of the line-of-sight positions of the user 10, that is, the distribution of the line-of-sight positions over the predetermined time. This allows the person conducting the monitor test or the like to easily grasp the change over time of the line-of-sight position of the user 10 on the display surface of the display 410. The data on the distribution of line-of-sight positions over the predetermined time is stored in the RAM 320.
When the line-of-sight position of the user 10 stays at or around the same position in the design proposal image 30 for a long time, it is presumed that the user 10 is gazing at that position or its surroundings, and the user 10 is considered to be interested in the content of the portion being gazed at.
As shown in FIG. 11, the distribution of line-of-sight positions can be illustrated as a heat map. For example, in a color heat map, portions where the line of sight is strongly concentrated are painted in warm colors (dark portions in the figure) and portions where the concentration is weak are painted in cool colors. For example, at the bottom of the "hourglass" illustration there is a zone of strong line-of-sight concentration (dark portion; the same applies hereinafter) B1, and at the top there is a zone of weak concentration (light portion; the same applies hereinafter) B2.
Next, the attractiveness is measured for each area of the image of the evaluation object (step S106). The attractiveness measurement unit 354 measures the attractiveness for each area of the design proposal image 30 on the basis of the information about the line of sight. In this specification, the measured attractiveness is defined, for example, as the ratio of the dwell time of the line-of-sight position in each area to the total dwell time over all areas. Here, the total dwell time over all areas is equal to the predetermined time. Therefore, the longer the line-of-sight position stays in a given area, the higher the measured attractiveness of that area, and the shorter it stays, the lower the measured attractiveness. The measured attractiveness summed over all areas is 1.0.
FIG. 12 illustrates the measured attractiveness in areas 1 to 4. In the example in the figure, the measured attractiveness in area 2 is 0.82, higher than the measured attractiveness in areas 1, 3, and 4. Area 2 is therefore considered to attract the attention of the user 10 more than areas 1, 3, and 4. In the example in the figure, the line-of-sight position of the user 10 is in area 2 for more than 80% of the predetermined time.
Next, the difference between the measured attractiveness and the estimated attractiveness is calculated for each area (step S107). The difference calculation unit 355 calculates the difference between the measured attractiveness and the estimated attractiveness (the attractiveness difference) for each area. FIG. 13 illustrates the attractiveness differences in areas 1 to 4. As described above, the measured attractiveness is considered to include the attractiveness resulting from the saliency of the design proposal image 30 (that is, the estimated attractiveness). Therefore, by subtracting the estimated attractiveness from the measured attractiveness, the attractiveness based on the sensibilities of the user 10 can be calculated.
In each area of the design proposal image 30, when the attractiveness difference is approximately 0 (zero) (or at locations in the design proposal image 30 where the attractiveness difference is approximately 0), only the attractiveness resulting from the saliency of the design proposal image 30 is included, and the attractiveness based on the sensibilities of the user 10 is considered to be approximately 0 or extremely small. When the attractiveness difference is positive (or at locations in the design proposal image 30 where the attractiveness difference is positive), the attractiveness based on the sensibilities of the user 10 is considered to be included in addition to the attractiveness resulting from saliency. This is presumably because that portion of the design proposal image 30 stimulated the sensibilities of the user 10 and the line-of-sight position of the user 10 moved to that portion. On the other hand, when the attractiveness difference is negative (or at locations in the design proposal image 30 where the attractiveness difference is negative), or when it is positive but small, the actual attractiveness is considered to have been almost the same as, or lower than, the attractiveness expected from saliency.
In this way, the attractiveness difference corresponds to the attractiveness based on the sensibilities of the user 10; the higher the attractiveness difference, the greater the contribution of the sensibilities of the user 10 to the measured attractiveness.
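The two quantities above reduce to a dwell-time ratio and a subtraction. The following is a minimal sketch, assuming gaze samples arrive at a fixed rate and each sample has already been assigned to an area; the sample trace is illustrative, and of the estimated values only the 0.52 for area 2 is taken from FIG. 9.

import numpy as np

def measured_attractiveness(area_ids, n_areas):
    """Measured attractiveness: fraction of gaze samples (dwell time at a
    fixed sampling rate) falling in each area; sums to 1.0."""
    counts = np.bincount(area_ids, minlength=n_areas)
    return counts / counts.sum()

# Toy gaze trace over 4 areas: area 2 (index 1) dominates the dwell time
gaze_area_ids = np.array([1] * 82 + [0] * 5 + [2] * 8 + [3] * 5)
measured = measured_attractiveness(gaze_area_ids, n_areas=4)
estimated = np.array([0.07, 0.52, 0.25, 0.16])  # area 2 as in FIG. 9; rest illustrative
difference = measured - estimated               # attractiveness difference per area
print(measured)    # [0.05 0.82 0.08 0.05]
print(difference)  # area 2: 0.82 - 0.52 = 0.30, as in the example below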
Next, as shown in FIG. 5B, a plurality of sensitivity question items 41 to 43 relating to the design proposal of the evaluation object are provided to the user 10 (step S108). The expression providing unit 357 selects, for example, the three words "innovative", "delicious", and "like" as the sensitivity question items.
As shown in FIGS. 14A to 14C, assume, for example, that the design proposal image 30 remains displayed while the three sensitivity question items 41 to 43 are displayed one at a time in order, and the user 10 evaluates the design proposal with respect to these sensitivity question items 41 to 43. In this case, the display control unit 356 performs control so that, for example, these three sensitivity question items 41 to 43 are displayed on the display 410 one at a time in order, at predetermined time intervals. Here, the predetermined time interval is a time long enough for the user 10 to evaluate the design proposal for each sensitivity question item and can be set, for example, to 10 seconds in line with the "time limit" described later.
When displaying each sensitivity question item on the display 410, the display control unit 356 also displays, for each displayed sensitivity question item, a message asking the user 10 to evaluate the design proposal, for example, "Please enter a score with the keyboard (1 to 10 points)".
The monitor test is a test of latent awareness. The user 10 is therefore briefed before the start of the monitor test that the design proposal should be evaluated for each sensitivity question item in as short a time as possible, that there are three sensitivity question items, that the display switches to the next sensitivity question item at 10-second intervals, and so on. The display control unit 356 may be configured to include the time limit in the message displayed on the display 410 so that the user 10 can confirm it. The number of sensitivity question items is not limited to three and may be two, or four or more.
Next, information about the line of sight of the user 10 with respect to the image of the evaluation object displayed on the display device 400 is acquired (step S109). The line-of-sight measuring device 200 measures the line of sight of the user 10 with respect to the design proposal image 30 displayed on the display device 400 and transmits the information about the line of sight to the line-of-sight information acquisition unit 358.
The user 10 evaluates the design proposal for each sensitivity question item according to his or her own sensibilities. Specifically, the user 10 inputs a score for the design proposal using the keyboard or mouse of the input device 500.
The input of the score can be set so as to be performed within a predetermined time limit (for example, 10 seconds). For example, if no answer is given within the time limit, the input of the score for the sensitivity question item being displayed is terminated, and if there is an undisplayed sensitivity question item, the process moves on to the answer for the next sensitivity question item. This is done in order to obtain an evaluation of the design proposal made according to the sensibilities of the user 10 and to avoid placing an undue burden on the user 10 (subject). The configuration is not limited to this; the monitor test may be forcibly terminated if even one of the three sensitivity question items is not answered within the time limit.
In the present embodiment, a time limit is set for answering so that the user 10 evaluates the design proposal intuitively with his or her own sensibilities, rather than on the basis of knowledge or logical thinking. A sensitivity question item for which the answer of the user 10 exceeds the time limit is treated as unanswered or as an invalid answer.
When there is no answer or an invalid answer, specific information about the sensibilities of the user 10 cannot be obtained, but the fact that the user 10 could not evaluate the design proposal within the time limit is itself considered one piece of information. Therefore, even when the user 10 could not evaluate the design proposal within the time limit and no valid answer was obtained, the information about the line of sight of the user 10 is saved and can be utilized as line-of-sight information for the case where the user 10 could not evaluate the design proposal.
The test can also be redone for sensitivity question items that were unanswered or received invalid answers, but the measurement results and the like obtained by redoing the test are used for the purpose of supplementing the measurement results of valid answers.
For each sensitivity question item, the sensitivity information acquisition unit 359 can also determine, from the length of the response time, whether the answer of the user 10 is a latent answer (following the sensibilities) or an answer that was most likely judged rationally. That is, when the response time is short, the user 10 is presumed to have judged latently, and when the response time is long, the user 10 is presumed to have judged rationally.
The temporal threshold for distinguishing latent from rational judgments is not particularly limited and can be set as appropriate according to the complexity of the design of the evaluation object. For example, when the design is not complicated, as in the design proposal of FIG. 6, the threshold can be set to 4 to 5 seconds. On the other hand, when the design is complicated, or when the design includes a product description, the threshold may be set to around 9 to 10 seconds, with the time limit extended to allow for the time the user 10 needs to read the description.
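As one illustration, the response-time classification described above could be coded as follows; the function name and the particular threshold and limit values are assumptions chosen within the ranges quoted in the text.

def classify_answer(response_time_s, time_limit_s=10.0, threshold_s=4.5):
    """Classify an answer by response time: 'invalid' if over the time
    limit, 'latent' (intuitive) if fast, otherwise 'rational'."""
    if response_time_s > time_limit_s:
        return "invalid"
    return "latent" if response_time_s <= threshold_s else "rational"

print(classify_answer(2.8))   # latent
print(classify_answer(7.3))   # rational
print(classify_answer(11.0))  # invalid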
Next, the sensitivity information acquisition unit 359 determines, for each sensitivity question item, whether the evaluation of the design proposal by the user 10 is complete (step S110). If, as a result of the determination, the evaluation is not complete (step S110: NO), the process returns to step S109.
On the other hand, when the evaluation of the design proposal is complete (step S110: YES), the measured attractiveness in each area is calculated for each sensitivity question item (step S111). As described above, the measured attractiveness is calculated as the ratio of the dwell time of the line-of-sight position in each area to the total dwell time over all areas; here, the total dwell time over all areas is equal to the response time for each sensitivity question item. The total dwell time over all areas may therefore differ between the sensitivity question items 41 to 43. FIG. 15 illustrates the measured attractiveness in areas 1 to 4 for each sensitivity question item. For example, the measured attractiveness when "innovative" was presented is 0.04, 0.83, 0.10, and 0.03 in areas 1 to 4, respectively. The measured attractiveness summed over all areas is 1.0.
Next, the sensitivity information of the user 10 regarding the design of the evaluation object is calculated (step S112). The sensitivity information acquisition unit 359 acquires the score input by the user 10 for the design proposal when each sensitivity question item was presented. FIG. 16 illustrates the scores input by the user 10 for the design proposal when each sensitivity question item was presented. In the example shown in the figure, the scores when "innovative", "delicious", and "like" were presented are 6, 3, and 8 points, respectively.
The sensitivity information acquisition unit 359 receives these scores as the evaluations of the user 10 for the respective sensitivity question items and, on the basis of the evaluations, calculates sensitivity information indicating the relationship between each sensitivity question item and the sensibilities of the user 10 regarding the design proposal.
For example, the sensitivity information acquisition unit 359 calculates the sensitivity information for each sensitivity question item so as to be proportional to the evaluation of the user 10 for that sensitivity question item. For example, as illustrated in FIG. 16, when the evaluations of the user 10 for the design proposal are 6, 3, and 8 points for "innovative", "delicious", and "like", respectively, the ratio of the sensitivity information of the sensitivity question items is set to 6:3:8.
Next, the attractiveness conversion value is calculated for each area (step S113). For each sensitivity question item, the sensitivity information acquisition unit 359 calculates the attractiveness conversion value for each area on the basis of the measured attractiveness and the sensitivity information in each area. In the present embodiment, the attractiveness conversion value is expressed as the product of the measured attractiveness and the sensitivity information in each area. In the example of FIG. 15, when the sensitivity question item is "innovative", the measured attractiveness of area 1 is 0.04 (the portion shown in gray in FIG. 15) and the sensitivity information is 6 (the portion shown in gray in FIG. 16). Therefore, the attractiveness conversion value is 0.04 × 6 = 0.24 ≈ 0.2 (the portion shown in gray in FIG. 17). The same calculation is performed for the other areas. That is, the attractiveness conversion value is the measured attractiveness of each area weighted by the sensitivity information corresponding to each sensitivity question item. FIG. 17 illustrates the attractiveness conversion values of each area for each sensitivity question item.
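Since the conversion value is a per-item, per-area product, the whole table of FIG. 17 can be produced in one element-wise multiplication. A minimal sketch follows; only the "innovative" row of measured values and the three scores are quoted in the text, so the other rows of the matrix are illustrative assumptions.

import numpy as np

# Rows: sensitivity question items ("innovative", "delicious", "like")
# Columns: areas 1 to 4; each row of measured attractiveness sums to 1.0.
measured = np.array([[0.04, 0.83, 0.10, 0.03],   # "innovative" row from FIG. 15
                     [0.05, 0.80, 0.10, 0.05],   # illustrative
                     [0.03, 0.87, 0.06, 0.04]])  # illustrative
scores = np.array([6, 3, 8])  # sensitivity information per question item (FIG. 16)

# Attractiveness conversion value = measured attractiveness x sensitivity info
conversion = measured * scores[:, np.newaxis]
print(conversion[0, 0])  # 0.04 * 6 = 0.24, rounded to 0.2 in the worked example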
Next, the relationship between the attraction-degree difference and the plurality of sensitivity question items 41 to 43 is extracted (step S114). As shown in FIGS. 18A to 18C, the sensitivity output unit 362 outputs to the display 410, for example, a graph of the correlation between the attraction-degree difference and the conversion value. The user 10's sensibility toward the plurality of sensitivity question items 41 to 43 is thereby visualized for the design proposal image 30.
In the example shown in FIG. 18A, when the sensitivity question item is "innovative", the attraction-degree difference is negative in areas 1, 3, and 4 and positive (0.3) in area 2. The same holds in FIGS. 18B and 18C for "looks delicious" and "like". In the examples of FIGS. 18A to 18C, the sensitivity extraction unit 361 therefore extracts the relationship that, for all of "innovative", "looks delicious", and "like", the attraction-degree difference takes a comparatively large positive value of 0.3 in area 2 and negative values in areas 1, 3, and 4. Based on this relationship, the sensitivity extraction unit 361 determines that the attraction degree based on the user 10's sensibility is high in area 2 and low in areas 1, 3, and 4.
In FIG. 17, the conversion values of area 2 when the sensitivity question items 41 to 43 are "innovative", "looks delicious", and "like" are 5.0, 1.0, and 7.4, respectively. The conversion values for "innovative" and "like" are higher than that for "looks delicious". That is, for "innovative" and "like", a large attraction-degree difference is accompanied by a large conversion value, so the correlation between the attraction-degree difference and the conversion value is higher than for "looks delicious".
The attraction-degree difference reflects the attraction degree based on the user 10's sensibility, but it does not take into account the influence that the sensitivity question item itself exerts on the user 10. The conversion value, being the product of the measured attraction degree and the sensitivity information, does incorporate that influence through the sensitivity information. Therefore, in a Cartesian graph with the attraction-degree difference on the horizontal axis and the conversion value on the vertical axis, the higher the correlation between the two, the stronger the influence of the sensitivity question item on the user 10 is judged to be.
In this way, the sensitivity extraction unit 361 extracts the relationship between the attraction-degree difference and the plurality of sensitivity question items from the attraction-degree difference and the conversion value. This makes it possible to determine which sensitivity question item is responsible for a large attraction-degree difference.
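The judgment made visually from FIGS. 18A to 18C can be quantified, for instance, with a correlation coefficient per question item. The sketch below uses Pearson's r; the patent does not prescribe a specific correlation measure, and the numbers are illustrative.

```python
# Sketch: per question item, how strongly the attraction-degree difference
# correlates with the conversion values across areas. Data are illustrative.
import statistics

def pearson(xs: list[float], ys: list[float]) -> float:
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

diff = [-0.1, 0.3, -0.1, -0.1]              # attraction-degree difference per area (assumed)
conv = {"innovative": [0.2, 5.0, 0.6, 0.2]}  # conversion values per area
print(pearson(diff, conv["innovative"]))     # ~0.997 -> strong influence of the item
```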
The sensitivity extraction unit 361 also extracts, for each area, the relationship between the attraction-degree difference and both the plurality of sensitivity question items and the attributes of the user. As described above, the attributes of the user 10, the subject, are set as: in her twenties, female, and working at a company in the city center. These attributes (the user's background) may explain why area 2 attracts the user 10's eyes more than the other areas, and why area 2 makes a good impression for "innovative" and "like" while its appeal for "looks delicious" is weak.
In summary, in the flowcharts shown in FIGS. 5A and 5B, one design proposal image 30 of the evaluation object is acquired, and the estimated attraction degree is calculated from the features of the image. The design proposal image 30 is displayed on the display device 400, and a plurality of sensitivity question items 41 to 43 regarding the design of the evaluation object are provided to the user 10. Information on the user 10's line of sight toward the displayed image is then acquired, and the measured attraction degree of the image is calculated from that information. Finally, the attraction-degree difference, that is, the difference between the estimated and measured attraction degrees, is calculated, and the relationship between this difference and the plurality of sensitivity question items is extracted.
Before the processing of the flowcharts of FIGS. 5A and 5B is carried out, a step may be included in which a dummy image (not shown) and dummy sensitivity question items are displayed on the display 410 so that the subject can practice the contents of the monitor test. The subject can then take the actual monitor test already accustomed to its contents. Measurements from the practice run with the dummy image are not added to the results of the actual monitor test.
In the example above, the sensitivity information acquisition unit 359 acquires the user 10's evaluation (score) of the design proposal and calculates the sensitivity information from it, but the calculation method is not limited to this. The sensitivity information acquisition unit 359 may instead acquire the user 10's response time for each sensitivity question item and calculate the sensitivity information from the response time, for example as its reciprocal.
Alternatively, a camera mounted on the design evaluation system 100 may capture the facial expression and posture of the user 10, and image processing such as pattern recognition may be applied to the captured images to estimate the user 10's sensitivity information for each sensitivity question item. This makes it possible to acquire sensitivity information while reducing the answering burden on the user 10.
In addition, for collecting data from a large number of subjects, it is considered effective to conduct the monitor test online over the Internet or a similar network.
(Modification 1)
FIG. 19 illustrates the variables used when the relationship between the attraction-degree difference and the plurality of sensitivity question items 41 to 43 is extracted by multiple regression analysis, and FIG. 20 illustrates the regression coefficients β_2 to β_4 calculated by that analysis.
FIGS. 18A to 18C illustrate displaying the correlation between the attraction-degree difference and the conversion value as a graph for the plurality of sensitivity question items 41 to 43, but this embodiment is not limited to such a case. For example, a statistical method (e.g., simple regression analysis or multiple regression analysis) can be adopted to extract the relationship between the attraction-degree difference and at least one sensitivity question item (and the attributes of the user 10).
In the statistical method, the sensitivity question items (and the attributes of the user 10) are taken as explanatory variables and the attraction-degree difference as the objective variable, and the correlation between the explanatory variables and the objective variable is analyzed by statistical processing.
As an example, consider multiple regression analysis. As shown in FIG. 19, let n be the number of subjects, X_ki the conversion value for each sensitivity question item, ε_i the error term, β_k the regression coefficients, and Y_i the attraction-degree difference of an area. The multiple regression equation is then Y_i = β_1 + β_2·X_2i + β_3·X_3i + … + β_k·X_ki + ε_i (i = 1, 2, 3, …, n). Here, β_1, β_2, β_3, …, β_k represent the influence of each explanatory variable, and the magnitude of each regression coefficient expresses quantitatively its influence on the attraction-degree difference.
This multiple regression equation contains k unknown regression coefficients, which can be calculated by the least-squares method. The error term is expressed as ε_i = Y_i − (β_1 + β_2·X_2i + β_3·X_3i + … + β_k·X_ki), and the coefficients that minimize the sum of squares S = Σ ε_i² are sought. To minimize S, one equation is prepared per coefficient by setting the corresponding first-order partial derivative of S to zero.
For β_1, for example, the above expression is expanded as
S = Σ{Y_i − (β_1 + β_2·X_2i + β_3·X_3i + … + β_k·X_ki)}²
= Σ{Y_i² − 2Y_i(β_1 + β_2·X_2i + β_3·X_3i + … + β_k·X_ki) + (β_1 + β_2·X_2i + β_3·X_3i + … + β_k·X_ki)²}
= Σ{Y_i² − 2Y_i(β_1 + β_2·X_2i + β_3·X_3i + … + β_k·X_ki) + β_1² + 2β_1(β_2·X_2i + β_3·X_3i + … + β_k·X_ki) + (β_2·X_2i + β_3·X_3i + … + β_k·X_ki)²}.
The first-order partial derivative with respect to β_1 is ∂S/∂β_1 = Σ{−2Y_i + 2β_1 + 2(β_2·X_2i + … + β_k·X_ki)} = −2Σ{Y_i − β_1 − (β_2·X_2i + … + β_k·X_ki)}, and this partial derivative is set to zero.
Similarly, the first-order partial derivative with respect to β_2 is ∂S/∂β_2 = 2Σ{X_2i²·β_2 − X_2i(Y_i − β_1 − β_3·X_3i − … − β_k·X_ki)}, which is likewise set to zero. The result is a system of k simultaneous linear equations in the k unknowns, from which β_1, β_2, …, β_k are uniquely determined. Therefore, as shown in FIG. 20, solving these simultaneous equations yields the regression coefficients β_2 to β_4, which indicate the degree of influence of the sensitivity question items.
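In practice, the normal equations need not be expanded by hand; a numerical least-squares routine solves the same problem. The sketch below uses NumPy with illustrative data; the column layout and all values are assumptions.

```python
# Sketch: estimating the regression coefficients of Modification 1 numerically.
# X holds one row per subject (conversion values per sensitivity question item),
# y the attraction-degree difference of one area. Data are illustrative.
import numpy as np

X = np.array([  # columns: "innovative", "looks delicious", "like"
    [5.0, 1.0, 7.4],
    [4.1, 0.8, 6.9],
    [3.2, 1.5, 5.0],
    [4.8, 0.9, 7.0],
])
y = np.array([0.30, 0.25, 0.15, 0.28])    # attraction-degree difference of area 2

A = np.hstack([np.ones((len(X), 1)), X])  # prepend an intercept column (beta_1)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta)  # [beta_1, beta_2, beta_3, beta_4]
```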
(Modification 2)
A method can also be adopted in which the sensitivity question items (and the attributes of the user 10) are taken as explanatory variables and the attraction-degree difference as the objective variable, and the relationship between the explanatory variables and the objective variable is analyzed by machine learning (hereinafter, the "machine-learning method").
In the machine-learning method, the sensitivity extraction unit 361 holds a machine learning model trained by supervised learning with the sensitivity information (and information on the attributes of the user 10) as explanatory variables and the attraction-degree difference as the objective variable, and uses this model to extract the relationship between the attraction-degree difference and the plurality of sensitivity question items.
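As a hedged sketch of such a model: the patent does not name a specific algorithm, so a random forest regressor stands in here, and the feature encoding and data are assumptions.

```python
# Sketch of the machine-learning method: sensitivity information (and user
# attributes) as explanatory variables, attraction-degree difference as the
# objective variable. All values are illustrative.
from sklearn.ensemble import RandomForestRegressor

# One row per (subject, area): [innovative, looks_delicious, like, age, gender_code]
X = [[6, 3, 8, 25, 1], [4, 7, 5, 34, 0], [7, 2, 9, 22, 1], [3, 6, 4, 41, 0]]
y = [0.30, -0.05, 0.35, -0.10]  # attraction-degree difference (assumed)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
print(model.feature_importances_)  # relative influence of each explanatory variable
```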
(Learning device)
FIG. 21 is a schematic block diagram illustrating a learning device 370 that generates a machine learning model from the attraction-degree difference and the sensitivity information (and information on the attributes of the user 10). The control device 300 functions as the learning device 370 when the CPU 310 executes a learning program.
The learning device 370 has a learning data storage unit 371 and a model generation unit 372. The learning data storage unit 371 stores learning data (teacher data) comprising the n sets of explanatory and objective variables obtained by conducting the monitor test on n subjects. The learning data may include, for example, the sensitivity information as an explanatory variable and the attraction-degree difference as the objective variable, and may further include information on the attributes of the user 10 as an explanatory variable.
Based on the learning data stored in the learning data storage unit 371, the model generation unit 372 generates a machine learning model that learns the relationship between the attraction-degree difference and the sensitivity information (and the attributes of the user 10). The generated model is transmitted to the sensitivity extraction unit 361 of the control device 300 and can be used when extracting the relationship between the attraction-degree difference and the plurality of sensitivity question items 41 to 43.
By conducting monitor tests with a sufficiently large number of subjects and accumulating learning data, the accuracy of estimating the attraction-degree difference for a specific sensitivity question item can be improved. As a result, the emotional effect of a persona on a design proposal can be simulated without running a monitor test with subjects.
Further, using the learning device 370, a system can be realized that, when conditions such as a persona and sensitivity information are specified for a given purchase target group, extracts several design proposals close to the specified conditions from a design database in which a large number of designs are stored in advance. The designer in charge of a product package design can consult these designs when creating a new design proposal or modifying an existing one. The system could also generate multiple variation patterns of a design proposal and present them to the designer.
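One way such a retrieval system could work is a nearest-neighbor search over stored design profiles; the database layout, the feature vectors, and the distance metric below are illustrative assumptions.

```python
# Sketch: return the stored designs whose sensitivity profiles are closest
# to the specified target conditions. Everything here is assumed for
# illustration; the patent does not specify the retrieval mechanism.
import numpy as np

design_db = {  # design id -> [innovative, looks_delicious, like] profile
    "design_A": np.array([6.0, 3.0, 8.0]),
    "design_B": np.array([2.0, 8.0, 4.0]),
    "design_C": np.array([7.0, 2.5, 7.5]),
}

def recommend(target: np.ndarray, k: int = 2) -> list[str]:
    dists = {name: float(np.linalg.norm(vec - target)) for name, vec in design_db.items()}
    return sorted(dists, key=dists.get)[:k]  # k closest designs

print(recommend(np.array([6.5, 3.0, 8.0])))  # ['design_A', 'design_C']
```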
The control device 300 of the present embodiment described above has the following effects.
The control device 300 of the present embodiment calculates the attraction degree based on the sensibility of the user 10 for the design proposal image 30 and extracts the relationship between that attraction degree and the plurality of sensitivity question items 41 to 43 concerning the image. The user 10's sensibility toward the design proposal image 30 can thus be visualized.
In addition, based on the relationship extracted by the control device 300, the designer who devised the product package design can judge in which direction to improve it with respect to the sensibility targets demanded by the product's brand owner.
A preferred design proposal is usually chosen from several product package design proposals at the planning stage of a new product, with the persona of the purchase target group in mind, and the extracted relationship can also help the brand owner make this decision. For example, it becomes easy to compare, for a given sensitivity question item, where on a design proposal image the attraction degree based on human sensibility is high, and how that attraction degree changes across different sensitivity question items. Therefore, when the sensibility points to be emphasized toward the purchase target group are already clear at the stage where several design proposals have been prepared, the relationship provides material for judging which of the proposals to select when marketing the product.
Furthermore, since the magnitude of the influence of each sensitivity question item on the attraction degree based on human sensibility can be grasped for the persona set in the product's sales strategy, an index is obtained for selecting a design proposal in consideration of the persona of the purchase target group.
(Second embodiment)
The first embodiment described extracting the relationship between the attraction-degree difference (that is, the attraction degree based on the user's sensibility) and the plurality of sensitivity question items for the image of a single evaluation object. The second embodiment extracts this relationship for the images of a plurality of evaluation objects. To avoid duplication, detailed description of the configuration shared with the first embodiment is omitted below.
FIG. 22 is a schematic top view of the design evaluation system 100 according to the second embodiment, and FIGS. 23A and 23B are flowcharts illustrating the design evaluation method of the control device 300 according to the second embodiment; the processing of these flowcharts is realized by the CPU 310 executing the design evaluation program. FIG. 24 is a schematic diagram illustrating design proposal 1; FIGS. 25A and 25B illustrate the area settings of the images of design proposals 1 and 2; and FIGS. 26A and 26B illustrate the saliency maps of those images. FIGS. 27A and 27B illustrate the estimated attraction degrees in areas 1 to 4 of the images of design proposals 1 and 2, and FIG. 28 illustrates the two images displayed in the standard state. FIG. 29 illustrates the distribution of the line-of-sight positions of a user viewing the two images, and FIGS. 30 and 31 illustrate the measured attraction degrees and the attraction-degree differences, respectively, in areas 1 to 4 of the two images. FIGS. 32A to 32C illustrate the display of the two images together with each sensitivity question item, and FIGS. 33A to 33C illustrate the distribution of the user's line-of-sight positions when each item is displayed. FIGS. 34A to 34C illustrate the measured attraction degrees for each sensitivity question item, and FIG. 35 illustrates the calculated reaction intensities when a design proposal is selected for each item. FIG. 36 is a graph illustrating the conversion values in areas 1 to 4 of the two images for each sensitivity question item, and FIGS. 37A to 37C are graphs illustrating the correlation between the attraction-degree difference and the conversion value for each item.
As shown in FIG. 22, in this embodiment, the images 31 and 32 of two evaluation objects and a sensitivity question item related to their designs (e.g., "innovative") 41 are presented to the user 10 simultaneously. The user 10 selects, according to his or her own sensibility, the design proposal that seems to fit the sensitivity question item from among the plurality of design proposals of the evaluation object.
First, as shown in FIG. 23A, images of the evaluation objects are acquired (step S201). In this embodiment, the design proposal shown in FIG. 24 (design proposal 1) is used in addition to the design proposal used in the first embodiment (design proposal 2). Design proposal 1 displays a concrete cooking example of the "ingredients" as an illustration. The image acquisition unit 351 acquires the design proposal 1 image 31 and the design proposal 2 image 32 (hereinafter also written as "design proposal 1 and 2 images 31 and 32").
Next, information on the attributes of the user 10 is acquired (step S202). The attribute information acquisition unit 352 reads, for example, the information on the attributes of the user 10 from the auxiliary storage unit 340.
Next, the attraction degree is estimated for each area of the images of the evaluation objects (step S203). As shown in FIGS. 25A and 25B, the attraction degree estimation unit 353 sets a plurality of areas (e.g., areas 1 to 4) for each of the design proposal 1 and 2 images 31 and 32. As shown in FIGS. 26A and 26B, it also generates saliency maps for the two images and stores them in the RAM 320. FIGS. 27A and 27B show the estimated attraction degrees in areas 1 to 4 of the design proposal 1 image 31 and the design proposal 2 image 32, respectively.
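As a sketch of how per-area estimated attraction degrees might be aggregated from a saliency map (the patent does not fix the aggregation rule; the area boxes and the share-of-total-saliency rule below are assumptions):

```python
# Sketch of step S203: per-area estimated attraction degree as each area's
# share of the total saliency in the map. The random map stands in for a
# computed saliency map; the bounding boxes are assumed.
import numpy as np

saliency = np.random.rand(600, 400)  # stand-in for a computed saliency map
areas = {                             # area -> (top, bottom, left, right)
    "area1": (0, 150, 0, 400),
    "area2": (150, 350, 0, 400),
    "area3": (350, 500, 0, 400),
    "area4": (500, 600, 0, 400),
}

total = saliency.sum()
estimated = {
    name: float(saliency[t:b, l:r].sum() / total)
    for name, (t, b, l, r) in areas.items()
}
print(estimated)  # per-area estimated attraction degrees summing to 1.0
```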
Next, the images of the evaluation objects are displayed (step S204). As shown in FIG. 28, in the standard state, the display control unit 356 causes the display 410 of the display device 400 to display the design proposal 1 and 2 images 31 and 32 side by side with a predetermined gap between them.
Next, information on the user 10's line of sight toward the displayed images is acquired (step S205). The line-of-sight measuring device 200 measures the user 10's line of sight toward the design proposal 1 and 2 images 31 and 32 displayed on the display device 400 and transmits the information on the line of sight to the line-of-sight information acquisition unit 358.
As shown in FIG. 29, the distribution of line-of-sight positions can be displayed as a heat map. For example, in the center of the cooking-example illustration in area 3 of the design proposal 1 image 31, there is a moderately strong concentration zone C1 of the line of sight. In the design proposal 2 image 32, there are strong concentration zones C2 to C4 on the characters in area 1 and at the upper and lower parts of the "hourglass" illustration in area 2.
Next, the attraction degree is measured for each area of the images (step S206). The attraction degree measurement unit 354 calculates the measured attraction degree for each area of the design proposal 1 and 2 images 31 and 32 from the information on the line of sight. FIG. 30 shows the measured attraction degrees in areas 1 to 4 of the two images.
Next, the difference between the measured and estimated attraction degrees is calculated for each area (step S207). The difference calculation unit 355 calculates this difference (the attraction-degree difference) for each area of the design proposal 1 and 2 images 31 and 32. FIG. 31 shows the attraction-degree differences in areas 1 to 4 of the two images.
Next, a plurality of sensitivity question items concerning the design proposals of the evaluation objects are provided to the user (step S208). In this embodiment, the control device 300 provides the user 10 with the three sensitivity question items 41 to 43, "innovative", "looks delicious", and "like", for design proposals 1 and 2. As in the first embodiment, these items are displayed one at a time in order on the display 410 of the display device 400 at a predetermined time interval (e.g., 10 seconds) (see FIGS. 32A to 32C). In the example shown in FIG. 32A, the sensitivity question item "innovative" 41 is displayed between the design proposal 1 image 31 and the design proposal 2 image 32.
When displaying each sensitivity question item on the display 410, the display control unit 356 also displays a message asking the user 10 to select a design proposal, for example, "Please select the design proposal that matches the displayed word from design proposal 1 and design proposal 2". The selection can be set so that it must be made within a predetermined time limit (e.g., 10 seconds).
Displaying the sensitivity question item midway between the design proposal 1 image 31 and the design proposal 2 image 32 prevents a psychological bias toward either design proposal; however, the position of the sensitivity question item is not limited to this intermediate display position.
Next, information on the user 10's line of sight toward the displayed images is acquired (step S209). The line-of-sight measuring device 200 measures the user 10's line of sight toward the design proposal 1 and 2 images 31 and 32 displayed on the display device 400 and transmits the information on the line of sight to the line-of-sight information acquisition unit 358.
For each sensitivity question item, the user 10 selects a design proposal according to his or her own sensibility. Specifically, to rank these design proposals, the user 10 uses the keyboard or mouse of the input device 500 to select either design proposal 1 or design proposal 2.
The input device 500 measures the response time from when each sensitivity question item is displayed until the user 10 selects design proposal 1 or design proposal 2 with the keyboard or mouse. The line-of-sight information acquisition unit 358 acquires the information on the user's line of sight toward the displayed design proposal 1 and 2 images 31 and 32 and records the movement history of the user 10's line of sight within the response time. The movement history is displayed on the display 410 of the display device 400 as a distribution of line-of-sight positions (see FIGS. 33A to 33C).
For example, as shown in FIG. 33A, when the sensitivity question item is "innovative", the heat map of the design proposal 1 image 31 has a comparatively strong concentration zone C5 of the line of sight on the characters above the cooking-example illustration and a weak concentration zone C6 on the characters below it. In the heat map of the design proposal 2 image 32, there is a weak concentration zone C7 at the upper part of the "hourglass" illustration and a strong concentration zone C8 at its lower part.
Next, the sensitivity information acquisition unit 359 determines, for each sensitivity question item, whether the user 10 has completed the selection of a design proposal (step S210). If the selection has not been completed (step S210: NO), the process returns to step S209.
If the selection has been completed (step S210: YES), the attraction degree is measured for each area (step S211). After the user 10 completes the selection of a design proposal, the attraction degree measurement unit 354 calculates, for each sensitivity question item, the measured attraction degrees of the design proposal 1 and 2 images 31 and 32 from the information on the line of sight. FIGS. 34A to 34C show the measured attraction degrees in areas 1 to 4 for "innovative", "looks delicious", and "like", respectively.
Next, the sensitivity information of the user 10 regarding the designs of the evaluation objects is calculated (step S212). The sensitivity information acquisition unit 359 calculates the user 10's sensitivity information for each sensitivity question item. In this embodiment, the sensitivity information is calculated in proportion to the reciprocal of the response time (hereinafter also called the "reaction intensity"), because the shorter the response time, the more strongly the sensitivity question item is considered to have impressed itself on the user 10's sensibility, that is, the stronger the connection between the item and the user 10's sensibility. In this embodiment, for example, the sensitivity information is set equal to the reaction intensity. As shown in FIG. 35, when the response times for "innovative", "looks delicious", and "like" are 3.5 seconds, 8.5 seconds, and 6.9 seconds, the reaction intensities are about 2.9, about 1.2, and about 1.4, respectively.
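A small sketch reproducing these figures: the values in FIG. 35 match 10/t rather than 1/t, which suggests the reciprocal is scaled by the 10-second time limit; that scaling is an assumption here.

```python
# Sketch of step S212: reaction intensity from response times. The factor 10
# (the assumed display time limit in seconds) is an illustrative assumption
# inferred from the figures, not stated explicitly in the text.
times = {"innovative": 3.5, "looks delicious": 8.5, "like": 6.9}  # seconds
intensity = {item: round(10.0 / t, 1) for item, t in times.items()}
print(intensity)  # {'innovative': 2.9, 'looks delicious': 1.2, 'like': 1.4}
```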
Next, the attraction-degree conversion value is calculated for each area (step S213). For each sensitivity question item, the sensitivity information acquisition unit 359 calculates the conversion value from the measured attraction degree of each area and the sensitivity information; in this embodiment, the conversion value is their product. FIG. 36 shows the conversion values in each area when each sensitivity question item is presented. For example, when the sensitivity question item is "innovative", the measured attraction degree of area 1 of the design proposal 1 image 31 is 0.02 (shaded gray in FIG. 34A) and the sensitivity information (reaction intensity) is 2.9 (shaded gray in FIG. 35), so the conversion value is 0.02 × 2.9 = 0.058 ≈ 0.06 (shaded gray in FIG. 36).
Next, the relationship between the attraction-degree difference and the plurality of sensitivity question items 41 to 43 is extracted (step S214). As shown in FIGS. 37A to 37C, the sensitivity output unit 362 outputs to the display 410, for example, graphs of the correlation between the attraction-degree difference and the conversion value. The user 10's sensibility toward the plurality of sensitivity question items 41 to 43 is thereby visualized for the design proposal images 31 and 32.
In the example shown in FIG. 37A, when the sensitivity question item is "innovative", the attraction-degree difference is a positive value of 0.3 or more in area 3 of the design proposal 1 image 31 and in area 2 of the design proposal 2 image 32. The same holds in FIGS. 37B and 37C for "looks delicious" and "like". In the examples of FIGS. 37A to 37C, the sensitivity extraction unit 361 therefore extracts the relationship that, for all of "innovative", "looks delicious", and "like", the attraction-degree difference takes a comparatively large positive value of 0.3 or more in area 3 of the design proposal 1 image 31 and in area 2 of the design proposal 2 image 32, and is approximately 0 or less in the other areas.
In the example shown in FIG. 36, the conversion values of area 2 of the design proposal 2 image 32 for "innovative", "looks delicious", and "like" are 3.77, 0.12, and 2.23, respectively. The conversion values are higher for "innovative" and "like" than for "looks delicious". That is, for "innovative" and "like", a large attraction-degree difference is accompanied by a large conversion value, so the correlation between the attraction-degree difference and the conversion value is higher than for "looks delicious". In this way, the sensitivity extraction unit 361 extracts the relationship between the attraction-degree difference and the plurality of sensitivity question items from the attraction-degree difference and the conversion value, which makes it possible to determine which sensitivity question item is responsible for a large attraction-degree difference.
The example above uses two design proposals, design proposals 1 and 2, but the method is also applicable when three or more design proposals are used.
The example above also assumed that the images of the evaluation objects have mutually different designs. However, the method is not limited to this case; it is also applicable when the images of the evaluation objects have the same design, with decorative printing applied to one image and not to the other. In this case, the image acquisition unit 351 acquires an image of the evaluation object with decorative printing and an image of the evaluation object without it. In this embodiment, decorative printing is, for example, foil-stamping printing or varnish-finish printing.
This also makes it easy to examine the emotional effect on a subject when the power message of a selected design proposal is changed, or the subject's emotional response regarding the impression of quality when decorative finishing such as foil-stamping printing or varnish-finish printing is added.
Needless to say, in this embodiment as well, the relationship between the attraction-degree difference and the plurality of sensitivity question items can be extracted by the statistical or machine-learning method described in the first embodiment. Using such a method makes it possible to recommend, from a plurality of design proposals, the design proposal best suited to a particular sensitivity question item.
The present invention is not limited to the embodiments described above and can be variously modified within the scope of the claims.
For example, the design evaluation program may be provided on a computer-readable recording medium such as a USB memory, a flexible disk, or a CD-ROM, or may be provided online via a network such as the Internet. In this case, the program recorded on the computer-readable recording medium is usually transferred to and stored in a memory, storage, or the like. The design evaluation program may be provided as stand-alone application software, or may be incorporated into the software of each device as one function of a server.
In addition, part or all of the processing executed by the program in the embodiments may be replaced by, and executed in, hardware such as circuits.
This application is based on Japanese Patent Application No. 2020-185842 filed on November 6, 2020, the disclosure of which is incorporated herein by reference in its entirety.
100 design evaluation system,
200 line-of-sight measuring device,
300 control unit,
310 CPU,
320 RAM,
330 ROM,
340 auxiliary storage,
351 Image acquisition department,
352 Attribute information acquisition department,
353 Invitation degree estimation department,
354 attraction measurement unit,
355 Difference calculation unit,
356 display control unit,
357 Expression Providing Department,
358 Line-of-sight information acquisition department,
359 Kansei Information Acquisition Department,
360 attraction conversion value calculation unit,
361 Sensitivity Extractor,
362 Sensitive output unit,
370 learning device,
371 Learning data storage unit,
372 model generator,
400 display device,
500 input device,
600 audio output device,
700 communication equipment.

Claims (21)

1. A design evaluation device comprising:
an image acquisition unit that acquires at least one image of an evaluation object;
an attraction degree estimation unit that estimates an attraction degree of the image based on features of the image;
a display control unit that displays the image on a display device;
an expression providing unit that provides a user with a plurality of expressions relating to a design of the evaluation object;
a line-of-sight information acquisition unit that acquires information on the user's line of sight toward the image displayed on the display device;
an attraction degree measurement unit that measures an attraction degree of the image based on the information on the line of sight;
a difference calculation unit that calculates a difference between the estimated attraction degree estimated by the attraction degree estimation unit and the measured attraction degree measured by the attraction degree measurement unit;
a sensitivity information acquisition unit that acquires sensitivity information indicating a relationship between the expressions and the user's sensibility regarding the design; and
a sensitivity extraction unit that extracts a relationship between the difference and the plurality of expressions using the sensitivity information.
2. The design evaluation device according to claim 1, wherein the difference calculation unit sets a plurality of mutually non-overlapping regions on the image and calculates the difference in each region, and the sensitivity extraction unit extracts the relationship between the difference and the plurality of expressions in each region.
3. The design evaluation device according to claim 2, further comprising an attraction degree conversion value calculation unit that calculates an attraction degree conversion value based on the measured attraction degree and the sensitivity information in each region, wherein the sensitivity information acquisition unit receives the user's answer to the expressions regarding the design and acquires the sensitivity information based on the answer, and the sensitivity extraction unit extracts the relationship based on a correlation between the difference and the attraction degree conversion value.
4. The design evaluation device according to claim 3, further comprising an attribute information acquisition unit that acquires information on attributes of the user, wherein the sensitivity extraction unit extracts, in each region, a relationship between the difference and both the plurality of expressions and the information on the attributes of the user.
5. The design evaluation device according to claim 2, wherein the sensitivity extraction unit extracts the relationship by a statistical method, with the plurality of expressions as explanatory variables and the difference as an objective variable.
6. The design evaluation device according to claim 2, further comprising an attribute information acquisition unit that acquires information on attributes of the user, wherein the sensitivity extraction unit extracts the relationship by a statistical method, with the plurality of expressions and the information on the attributes of the user as explanatory variables and the difference as an objective variable.
7. The design evaluation device according to claim 2, further comprising an attribute information acquisition unit that acquires information on attributes of the user, wherein the sensitivity extraction unit extracts the relationship using a machine learning model trained with the sensitivity information and the information on the attributes of the user as explanatory variables and the difference as an objective variable.
8. The design evaluation device according to any one of claims 1 to 7, wherein the sensitivity extraction unit extracts the relationship for a plurality of the users.
9. The design evaluation device according to any one of claims 1 to 8, wherein the sensitivity information acquisition unit acquires the user's evaluation of the design and calculates the sensitivity information based on the user's evaluation.
10. The design evaluation device according to any one of claims 1 to 8, wherein the sensitivity information acquisition unit acquires, for each expression, the response time taken until the user completes an answer, and calculates the sensitivity information based on the response time.
11. The design evaluation device according to any one of claims 1 to 10, wherein the line-of-sight information acquisition unit continuously acquires the information on the line of sight from when the expression providing unit provides the expression to the user until the sensitivity information acquisition unit receives the user's answer to the expression.
12. The design evaluation device according to any one of claims 1 to 11, wherein the display control unit displays the image and the expression on the display device simultaneously.
13. The design evaluation device according to any one of claims 1 to 12, wherein the expressions include at least words.
14. The design evaluation device according to any one of claims 1 to 13, wherein the image acquisition unit acquires a plurality of images of the evaluation object, the attraction degree estimation unit estimates the attraction degree of each image based on the features of that image, the display control unit displays each image on the display device, and the attraction degree measurement unit measures the attraction degree of each image based on the information on the line of sight.
15. The design evaluation device according to claim 14, wherein the image acquisition unit acquires an image of an evaluation object to which decorative printing has been applied and an image of the evaluation object to which the decorative printing has not been applied.
16. A design evaluation system comprising:
the display device;
a line-of-sight measuring device that measures a user's line of sight toward the image displayed on the display device and outputs the information on the line of sight to the line-of-sight information acquisition unit; and
the design evaluation device according to any one of claims 1 to 15.
  17.  A design evaluation method comprising:
     a step (a) of acquiring at least one image of an evaluation object;
     a step (b) of estimating an attraction degree of each image based on features of the image;
     a step (c) of displaying the image on a display device;
     a step (d) of providing a user with a plurality of expressions relating to the design of the evaluation object;
     a step (e) of acquiring information on the user's line of sight with respect to the image displayed on the display device;
     a step (f) of measuring the attraction degree of the image based on the information on the line of sight;
     a step (g) of calculating a difference between the estimated attraction degree obtained in step (b) and the measured attraction degree obtained in step (f);
     a step (h) of acquiring sensibility information indicating a relationship between the expressions and the user's sensibility regarding the design; and
     a step (i) of extracting, using the sensibility information, a relevance between the difference and the plurality of expressions.
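The sketch below strings steps (a) through (i) into one pipeline, reusing the estimate_attraction and measure_attraction helpers sketched after claim 14. The show, track_gaze, and ask_user callables are hypothetical stand-ins for the display device, gaze tracker, and questionnaire UI, and the correlation-based extract_relevance is only one possible reading of step (i).

```python
import numpy as np

def extract_relevance(results):
    """Step (i): correlate each expression's score with the per-image attraction gap."""
    diffs = np.array([np.mean(np.abs(d)) for d, _ in results])   # (n_images,)
    scores = np.array([s for _, s in results])                   # (n_images, n_expressions)
    diffs_c = diffs - diffs.mean()
    scores_c = scores - scores.mean(axis=0)
    denom = np.linalg.norm(diffs_c) * np.linalg.norm(scores_c, axis=0)
    return (diffs_c @ scores_c) / np.where(denom == 0.0, 1.0, denom)

def evaluate_design(image_paths, expressions, show, track_gaze, ask_user):
    """Steps (a)-(i) of the claimed method, as one pipeline (illustrative)."""
    results = []
    for path in image_paths:                                     # (a) acquire image(s)
        estimated = estimate_attraction(path)                    # (b) estimate from features
        show(path, expressions)                                  # (c)+(d) display image and expressions
        fixations = track_gaze()                                 # (e) line-of-sight information
        measured = measure_attraction(fixations, estimated.shape)  # (f) measure
        diff = measured - estimated                              # (g) attraction difference
        results.append((diff, ask_user(expressions)))            # (h) sensibility information
    return extract_relevance(results)                            # (i) relevance per expression
```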
  18.  The design evaluation method according to claim 17, wherein
     in step (g), a plurality of mutually non-overlapping regions are set on the image and the difference is calculated in each region, and
     in step (i), the relevance between the difference and the plurality of expressions is extracted for each of the regions.
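A sketch of the region-wise variant in claim 18 follows. The claim requires only that the regions not overlap; the fixed grid partition and its size here are assumptions.

```python
import numpy as np

def regional_differences(estimated: np.ndarray, measured: np.ndarray,
                         rows: int = 4, cols: int = 4) -> np.ndarray:
    """Mean (measured - estimated) attraction degree per non-overlapping grid cell."""
    h, w = estimated.shape
    diffs = np.zeros((rows, cols), dtype=np.float32)
    for r in range(rows):
        for c in range(cols):
            ys = slice(r * h // rows, (r + 1) * h // rows)
            xs = slice(c * w // cols, (c + 1) * w // cols)
            diffs[r, c] = float(np.mean(measured[ys, xs] - estimated[ys, xs]))
    return diffs
```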
  19.  The design evaluation method according to claim 17 or 18, further comprising, before step (i), a step (j) of acquiring information on attributes of the user,
     wherein in step (i), the relevance is extracted using the sensibility information and the information on the user's attributes as explanatory variables and the difference as an objective variable.
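With user attributes added, step (i) reads naturally as a regression with the sensibility scores and encoded attributes as explanatory variables and the attraction-degree difference as the objective variable. A sketch using scikit-learn linear regression; the model choice and the toy data are assumptions, not from the patent.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: rows = observations (one user viewing one image);
# columns = expression scores plus encoded user attributes.
X = np.array([[0.8, 0.1, 1.0, 25],   # e.g. ["cute", "classy", female=1, age]
              [0.2, 0.9, 0.0, 41],
              [0.5, 0.4, 1.0, 33]])
y = np.array([0.12, -0.05, 0.03])    # measured minus estimated attraction degree

model = LinearRegression().fit(X, y)
print(model.coef_)  # one weight per expression/attribute = the extracted relevance
```

The fitted coefficients then indicate how strongly each expression or attribute relates to the gap between estimated and measured attraction.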
  20.  A design evaluation program for causing a computer to execute the processing included in the design evaluation method according to any one of claims 17 to 19.
  21.  A learning device comprising:
     a storage unit that stores learning data including sensibility information as an explanatory variable and an attraction degree difference as an objective variable; and
     a model generation unit that generates a machine learning model by machine-learning the relationship between the sensibility information and the attraction degree difference based on the learning data stored in the storage unit.
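A sketch of the learning device in claim 21: an in-memory storage unit and a model generation unit that fits sensibility vectors to attraction-degree differences. The specific regressor is an assumption; the claim requires only "a machine learning model".

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

class LearningDevice:
    def __init__(self):
        self._X, self._y = [], []   # storage unit (held in memory for this sketch)

    def store(self, sensibility: list[float], attraction_diff: float) -> None:
        """Store one learning sample: sensibility vector -> attraction difference."""
        self._X.append(sensibility)
        self._y.append(attraction_diff)

    def generate_model(self) -> GradientBoostingRegressor:
        """Model generation unit: learn the sensibility-to-difference relationship."""
        return GradientBoostingRegressor().fit(np.array(self._X), np.array(self._y))
```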
PCT/JP2021/038376 2020-11-06 2021-10-18 Design evaluating device, design evaluating system, design evaluating method, design evaluating program, and learning device WO2022097457A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022560698A JPWO2022097457A1 (en) 2020-11-06 2021-10-18

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-185842 2020-11-06
JP2020185842 2020-11-06

Publications (1)

Publication Number Publication Date
WO2022097457A1 (en) 2022-05-12

Family

ID=81457170

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/038376 WO2022097457A1 (en) 2020-11-06 2021-10-18 Design evaluating device, design evaluating system, design evaluating method, design evaluating program, and learning device

Country Status (2)

Country Link
JP (1) JPWO2022097457A1 (en)
WO (1) WO2022097457A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011039778A (en) * 2009-08-11 2011-02-24 Nippon Hoso Kyokai <Nhk> Moving image content evaluation device and computer program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU, HUIYING ET AL.: "Improving Visual Saliency Computing With Emotion Intensity", IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, vol. 27, no. 6, pages 1201-1213, XP011610547, DOI: 10.1109/TNNLS.2016.2553579 *

Also Published As

Publication number Publication date
JPWO2022097457A1 (en) 2022-05-12

Similar Documents

Publication Publication Date Title
Wook Chae et al. Exploring the effect of the human brand on consumers' decision quality in online shopping: An eye‐tracking approach
Casado-Aranda et al. How consumer ethnocentrism modulates neural processing of domestic and foreign products: A neuroimaging study
Kleisner et al. Perceived intelligence is associated with measured intelligence in men but not women
Orquin et al. Areas of interest as a signal detection problem in behavioral eye‐tracking research
Guo et al. Can eye-tracking data be measured to assess product design?: Visual attention mechanism should be considered
Grogan et al. Dress fit and body image: A thematic analysis of women's accounts during and after trying on dresses
Kumar et al. Indian consumers' purchase intention toward a United States versus local brand
Paulins An analysis of customer service quality to college students as influenced by customer appearance through dress during the in-store shopping process
KR20090045301A (en) Systems and methods for product attribute analysis and product recommendation
Wang et al. The cross‐modal interaction between sound frequency and color saturation on consumer's product size perception, preference, and purchase
Manuel et al. Body shape and fit preference in body cathexis and clothing benefits sought for professional African-American women
Calvo-Porral et al. Specialty food retailing: examining the role of products’ perceived quality
KR20190041081A (en) Evaluation system of cognitive ability based on virtual reality for diagnosis of cognitive impairment
Chynał et al. Shopping behaviour analysis using eyetracking and EEG
JP7423994B2 (en) Recommendation device and recommendation method
Hernández et al. Can virtually trying on apparel help in selecting the correct size?
Józsa et al. Find the difference: Eye tracking study on information seeking behavior using an online game
Xiao et al. Mobile marketing interface layout attributes that affect user aesthetic preference: an eye-tracking study
Hsu An integrated-mental brainwave system for analyses and judgments of consumer preference
Jung et al. Evolutionary programming based recommendation system for online shopping
Meyerding Combining eye-tracking and choice-based conjoint analysis in a bottom-up experiment.
WO2022075404A1 (en) Design evaluation device, design evaluation system, design evaluation method, and design evaluation program
WO2022097457A1 (en) Design evaluating device, design evaluating system, design evaluating method, design evaluating program, and learning device
Kim The impact of body image self-discrepancy on body dissatisfaction, fashion involvement, concerns with fit and size of garments, and loyalty intentions in online apparel shopping
JP2002109175A (en) Method and system for diagnosing brand power

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 21889010; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2022560698; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: PCT application non-entry in European phase
    Ref document number: 21889010; Country of ref document: EP; Kind code of ref document: A1