WO2021100515A1 - Information processing device and information processing method - Google Patents

Information processing device and information processing method

Info

Publication number
WO2021100515A1
WO2021100515A1 (PCT/JP2020/041733)
Authority
WO
WIPO (PCT)
Prior art keywords
user
fragrance
data
information processing
control unit
Prior art date
Application number
PCT/JP2020/041733
Other languages
French (fr)
Japanese (ja)
Inventor
彩 藤田
修二 藤田
直子 石塚
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Priority to JP2021558299A (JPWO2021100515A1)
Publication of WO2021100515A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services

Definitions

  • The present technology relates to an information processing device and an information processing method, and more particularly to an information processing device and an information processing method capable of supporting a user by means of fragrance.
  • In recent years, attention has been drawn not only to user experiences based on sight and hearing, such as images and sounds, but also to devices that generate scents and appeal to the user's sense of smell (see, for example, Patent Document 1).
  • The present technology was made in view of such a situation and makes it possible to support the user by means of scent.
  • An information processing device according to one aspect of the present technology is an information processing device including a control unit that uses a determination model generated by machine learning on personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
  • An information processing method according to one aspect of the present technology is an information processing method in which an information processing device uses a determination model generated by machine learning on personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
  • In the information processing device and the information processing method according to one aspect of the present technology, a determination model generated by machine learning on personal data for each user is used, and the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data is controlled.
  • The information processing device according to one aspect of the present technology may be an independent device or an internal block constituting a single device.
  • FIG. 1 shows an example of the configuration of an embodiment of a fragrance support system to which the present technology is applied.
  • The fragrance support system 1 is a system that provides scent-based support to the user.
  • The fragrance support system 1 is composed of terminal devices 10-1 to 10-N (N is an integer of 1 or more) and a server 20, which are connected to each other via a network 50.
  • The terminal device 10-1 is configured as a wearable terminal such as a spectacle-type device having a transmissive display.
  • In the spectacle-type device, a pair of transmissive displays for the left and right eyes is arranged where the lenses would be attached to the frame of ordinary spectacles, and the device is worn over the user's eyes. Various information can therefore be drawn into the field of view using AR (Augmented Reality) technology or the like, without cutting the user's field of view off from the real space.
  • The terminal device 10-1 performs determination processing using a trained determination model provided from the server 20 via the network 50, together with sensor data acquired by the camera unit, the sensor unit, and the like, and controls the discharge of a fragrance to support the user based on the determination result.
  • The terminal device 10-1 is not limited to a spectacle-type device, and may be, for example, a wearable terminal in the shape of a pin badge, a contact lens, a kuroko, or the like. Further, the wearable terminal can include, for example, a device such as an electronic tattoo that is attached to a surface such as human skin or clothing.
  • The terminal devices 10-2 to 10-N are each configured as a wearable terminal such as a spectacle-type device, like the terminal device 10-1. In the following description, the terminal devices 10-1 to 10-N are referred to as the terminal device 10 when they do not need to be distinguished.
  • The server 20 is composed of one or a plurality of servers that provide various services to the terminal device 10 via the network 50.
  • For example, the server 20 performs a learning process using learning data to generate a determination model and provides it to the terminal device 10. Further, for example, the server 20 performs a matching process based on data transmitted from the terminal device 10 and provides the matching result to the terminal device 10.
  • The network 50 includes communication networks such as the Internet, mobile phone networks, and intranets. That is, in the fragrance support system 1 of FIG. 1, TCP/IP (Transmission Control Protocol/Internet Protocol) communication via the Internet, cellular mobile communication such as LTE-Advanced or 5G, and wireless LAN (Local Area Network) communication via an access point are adopted.
  • FIG. 2 shows an example of the configuration of the terminal device 10 of FIG. 1.
  • In the terminal device 10, the CPU (Central Processing Unit) 101, the ROM (Read Only Memory) 102, and the RAM (Random Access Memory) 103 are connected to one another by a bus 104.
  • The CPU 101 controls the operation of each part of the terminal device 10 by executing a program recorded in the ROM 102. Various data are recorded in the RAM 103 as appropriate.
  • The input/output interface 110 is also connected to the bus 104.
  • An input unit 105, an output unit 106, a storage unit 107, a communication unit 108, and a short-range wireless communication unit 109 are connected to the input/output interface 110.
  • The input unit 105 supplies various input data to each unit, including the CPU 101, via the input/output interface 110.
  • For example, the input unit 105 has an operation unit 111, a camera unit 112, and a sensor unit 113.
  • The operation unit 111 is operated by the user and supplies operation data corresponding to the operation to the CPU 101.
  • The operation unit 111 is composed of physical buttons and the like.
  • The camera unit 112 photoelectrically converts light incident from a subject, performs predetermined signal processing on the resulting electric signal to generate captured image data, and outputs the captured image data as sensor data.
  • The camera unit 112 is composed of an image sensor, a signal processing unit, and the like.
  • The sensor unit 113 senses spatial information, time information, and the like, and outputs the sensor data obtained as a result of the sensing.
  • The sensor unit 113 includes an inertial measurement unit (IMU: Inertial Measurement Unit) that measures three-dimensional angular velocity and acceleration.
  • The inertial measurement unit (IMU) can obtain three-dimensional angular velocity and acceleration using a three-axis gyroscope and a three-axis accelerometer.
  • The sensor unit 113 can also include various other sensors, such as a biometric sensor that measures information such as the heart rate, body temperature, or posture of a living body, a proximity sensor that detects nearby objects, and a magnetic sensor that measures the magnitude and direction of a magnetic field.
  • The output unit 106 outputs various information and substances (fragrances and the like) according to control from the CPU 101 via the input/output interface 110.
  • For example, the output unit 106 has a display unit 121, a sound output unit 122, and a fragrance discharge unit 123.
  • The display unit 121 displays an image or the like corresponding to image data according to control from the CPU 101.
  • The display unit 121 is configured as a pair of transmissive displays for the left eye and the right eye.
  • The sound output unit 122 outputs a sound corresponding to sound data according to control from the CPU 101.
  • The sound output unit 122 is configured as small headphones arranged at positions close to the user's left and right ears.
  • The fragrance discharge unit 123 discharges various fragrances prepared in advance according to control from the CPU 101.
  • The fragrance discharge unit 123 has a fragrance holding structure filled with a liquid fragrance; air is passed through a holding space provided in the fragrance holding structure, and the liquid fragrance held by a fragrance holding body arranged in that space is vaporized and released.
  • The liquid fragrance includes various liquids that can generate a scent when vaporized.
  • For example, the liquid fragrance may include a perfume, a liquid seasoning, or the like.
  • When a plurality of types of liquid fragrances are used, a plurality of holding spaces may be provided in the fragrance holding structure, and the liquid fragrance held in the fragrance holding body of a holding space selected from among the plurality of holding spaces may be vaporized and released.
  • The fragrance discharge unit 123 may be built into the terminal device 10, or may be provided in a housing external to the terminal device 10 so as to be controlled by the terminal device 10.
  • The storage unit 107 records various data and programs according to control from the CPU 101.
  • The CPU 101 reads various data from the storage unit 107, processes the data, and executes programs.
  • The storage unit 107 is configured as an auxiliary storage device such as a semiconductor memory.
  • The storage unit 107 may be configured as an internal storage, or may be an external storage such as a memory card.
  • The communication unit 108 communicates with other devices via the network 50 according to control from the CPU 101.
  • For example, the communication unit 108 is configured as a communication module that supports cellular communication (for example, LTE-Advanced or 5G), wireless communication such as wireless LAN, or wired communication.
  • The short-range wireless communication unit 109 performs wireless communication based on short-range wireless communication standards such as Bluetooth (registered trademark) and NFC (Near Field Communication) and exchanges various data.
  • FIG. 3 shows an example of the functional configuration of the control unit 100 included in the terminal device 10.
  • The functions of the control unit 100 are realized by the CPU 101 executing a program.
  • The control unit 100 includes a sensor data acquisition unit 151, a determination model application unit 152, an emotion data acquisition unit 153, a situation analysis unit 154, and a fragrance discharge control unit 155.
  • The sensor data acquisition unit 151 acquires the sensor data detected by the camera unit 112 and the sensor unit 113 and supplies the sensor data to the determination model application unit 152 and the situation analysis unit 154.
  • The sensor data includes captured image data.
  • The determination model application unit 152 can use the trained determination model provided from the server 20 via the network 50.
  • The determination model application unit 152 applies the determination model to the captured image data supplied from the sensor data acquisition unit 151 and supplies the determination result data to the situation analysis unit 154.
  • The emotion data acquisition unit 153 acquires emotion data related to a person's emotions and supplies it to the situation analysis unit 154.
  • This emotion data is data related to the emotions that a target partner has toward the user, and includes, for example, sensor data detected by a wearable terminal worn by the target partner and sensor data detected by the terminal device 10.
  • The situation analysis unit 154 is supplied with the sensor data from the sensor data acquisition unit 151, the determination result data from the determination model application unit 152, and the emotion data from the emotion data acquisition unit 153.
  • The situation analysis unit 154 analyzes various situations (for example, the situation around the user and the emotions of the target partner) based on the supplied data.
  • The situation analysis unit 154 supplies the data of the analysis result to the fragrance discharge control unit 155.
  • The fragrance discharge control unit 155 controls the fragrance discharge unit 123 based on the analysis result data supplied from the situation analysis unit 154 to discharge various fragrances.
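  • As an illustration of the data flow just described, the following is a minimal Python sketch of the control unit 100 pipeline (sensor data acquisition, determination model application, situation analysis, and fragrance discharge control). The class and attribute names are assumptions for illustration and do not appear in the publication.

        # Minimal sketch of the control unit 100 pipeline; all names are
        # illustrative assumptions, not identifiers from the publication.
        from dataclasses import dataclass

        @dataclass
        class SensorData:
            image: bytes  # captured image data from the camera unit 112
            imu: tuple    # angular velocity / acceleration from the sensor unit 113

        class ControlUnit:
            def __init__(self, model, analyzer, discharger):
                self.model = model            # trained determination model from the server 20
                self.analyzer = analyzer      # plays the role of the situation analysis unit 154
                self.discharger = discharger  # plays the role of the fragrance discharge unit 123

            def step(self, sensor_data: SensorData, emotion_data=None):
                # determination model application unit 152: apply the model to image data
                result = self.model.predict(sensor_data.image)
                # situation analysis unit 154: analyze surroundings and emotions
                situation = self.analyzer.analyze(sensor_data, result, emotion_data)
                # fragrance discharge control unit 155: act on the analysis result
                if situation.requires_fragrance:
                    self.discharger.discharge(situation.scent, situation.amount)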
  • FIG. 4 shows an example of the configuration of the server 20 of FIG. 1.
  • In the server 20, the CPU 201, the ROM 202, and the RAM 203 are connected to one another by a bus 204.
  • The CPU 201 controls the operation of each part of the server 20 by executing a program recorded in the ROM 202. Various data are recorded in the RAM 203 as appropriate.
  • The input/output interface 211 is also connected to the bus 204.
  • An input unit 205, an output unit 206, a storage unit 207, a communication unit 208, and a drive 209 are connected to the input/output interface 211.
  • The input unit 205 supplies various input data to each unit, including the CPU 201, via the input/output interface 211.
  • For example, the input unit 205 includes a keyboard, a mouse, a microphone, and the like.
  • The output unit 206 outputs various information according to control from the CPU 201 via the input/output interface 211.
  • For example, the output unit 206 has a display unit 221 and a sound output unit 222.
  • The display unit 221 displays an image or the like corresponding to image data according to control from the CPU 201.
  • The display unit 221 is composed of a liquid crystal panel, an OLED (Organic Light Emitting Diode) panel, a signal processing unit, and the like.
  • The sound output unit 222 outputs a sound corresponding to sound data according to control from the CPU 201.
  • The sound output unit 222 is composed of a speaker or the like.
  • The storage unit 207 records various data and programs according to control from the CPU 201.
  • The CPU 201 reads various data from the storage unit 207, processes the data, and executes programs.
  • The storage unit 207 is configured as an auxiliary storage device such as a semiconductor memory or an HDD (Hard Disk Drive).
  • The storage unit 207 may be configured as an internal storage or may be an external storage.
  • The communication unit 208 communicates with other devices via the network 50 according to control from the CPU 201.
  • For example, the communication unit 208 is configured as a communication module that supports cellular communication (for example, LTE-Advanced or 5G), wireless communication such as wireless LAN, or wired communication such as Ethernet (registered trademark).
  • A removable recording medium such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory can be mounted on the drive 209, and the removable recording medium is driven according to control from the CPU 201.
  • The CPU 201 can control the drive 209 to read data written on the driven removable recording medium.
  • FIG. 5 shows an example of the functional configuration of the control unit 200 included in the server 20.
  • The functions of the control unit 200 are realized by the CPU 201 executing a program.
  • The control unit 200 has a learning data acquisition unit 251, a learning DB 252, a learning unit 253, and a providing unit 254.
  • The learning DB 252 is recorded in the storage unit 207.
  • The learning data acquisition unit 251 acquires personal data for each user and stores it in the learning DB 252 as learning data.
  • This personal data includes general information about the individual user, such as data on the compatibility of the user with other people, biometric data, social data, life logs (including behavior patterns and location information), and schedules.
  • The data set used as personal data can include data in any format, such as text data, image data, moving image data, and audio data.
  • The learning unit 253 performs learning by machine learning using the personal data stored in the learning DB 252 and generates a determination model. The trained determination model is supplied to the providing unit 254.
  • The providing unit 254 controls the communication unit 208 to provide the trained determination model supplied from the learning unit 253 to the terminal device 10 via the network 50.
  • The terminal device 10 worn by the user is thus provided, from the server 20, with a determination model trained using personal data such as image data of persons the user likes, and the model is made available to the control unit 100.
  • In step S101, the sensor data acquisition unit 151 acquires captured image data captured in real time by the camera unit 112 provided in the terminal device 10, such as a spectacle-type device. At this time, the camera unit 112 operates, so to speak, as a surveillance camera that monitors the user's surroundings.
  • In step S102, the determination model application unit 152 applies the trained determination model to the acquired captured image data.
  • In step S103, the situation analysis unit 154 determines, based on the determination result data of the determination model, the captured image data, and the like, whether a person similar to a person the user likes has been detected among the surrounding people within a radius of 10 m centered on the user.
  • The personal data used as learning data in the learning process executed by the control unit 200 (learning unit 253) of the server 20 may additionally include information such as occupation, annual income, marital status, address, hobbies, age, and DNA (Deoxyribonucleic Acid) information, and persons matching desired conditions set by the user may then be detected using the trained determination model.
  • For example, the situation analysis unit 154 can obtain a score value for each surrounding person within a radius of 10 m centered on the user by performing a predetermined arithmetic process that takes into account both the similarity to a person the user likes and the desired conditions, and can estimate the person with the highest score value among the surrounding people as a person similar to the person the user likes.
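  • As one possible reading of this "predetermined arithmetic process", the score could be a weighted combination of the model's similarity output and per-condition matches, as in the sketch below; the weighting and the condition fields are assumptions for illustration, not specified in the publication.

        # Hypothetical score combining model similarity with desired conditions.
        def score(similarity: float, person: dict, desired: dict, w_sim: float = 0.7) -> float:
            """similarity: determination model output in [0, 1];
            person / desired: attribute dictionaries, e.g. {"hobby": "tennis", "age": 30}."""
            matched = sum(1 for k, v in desired.items() if person.get(k) == v)
            condition_score = matched / len(desired) if desired else 0.0
            return w_sim * similarity + (1.0 - w_sim) * condition_score

        def best_candidate(candidates, desired):
            # candidates: (person, similarity) pairs, already filtered to the 10 m radius
            return max(candidates, key=lambda c: score(c[1], c[0], desired))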
  • To confirm that a surrounding person is actually nearby, the position data (for example, GPS (Global Positioning System) position data) acquired by the user's terminal device 10 and by a device carried by the surrounding person may be collated, or radio waves may be captured by short-range wireless communication such as Bluetooth (registered trademark).
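  • As a concrete illustration of collating position data, the sketch below compares two GPS fixes with the haversine formula against the 10 m radius used above; the function names are assumptions for illustration.

        # Hypothetical proximity check by collating GPS position data.
        import math

        def distance_m(lat1, lon1, lat2, lon2):
            """Great-circle distance in meters between two GPS fixes (haversine)."""
            r = 6_371_000  # mean Earth radius in meters
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
            a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
            return 2 * r * math.asin(math.sqrt(a))

        def is_nearby(user_fix, other_fix, radius_m=10.0):
            return distance_m(*user_fix, *other_fix) <= radius_m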
  • If it is determined in the determination process of step S103 that no similar person has been detected, the process returns to step S101, and the processes of steps S101 to S103 are repeated.
  • In step S104, the situation analysis unit 154 analyzes the direction in which the person similar to a person the user likes is facing, based on the determination result data of the determination model, the captured image data, and the like.
  • In step S105, the situation analysis unit 154 determines whether the similar person is looking in the direction of the user, based on the analysis result of the direction in which the similar person is facing.
  • If it is determined in the determination process of step S105 that the similar person is looking in the direction of the user, the process proceeds to step S106.
  • In step S106, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that calms the user.
  • Which scent calms (or excites) each user can be determined in advance: the user evaluates the impression of each scent in a fragrance set prepared beforehand, and a library associating the various scents with each user's impressions of them is built and kept.
  • This library may be held by the terminal device 10, or may be managed by the server 20 and provided to the terminal device 10 as needed.
  • As the impressions of the fragrances to be evaluated, adjective expressions such as "exciting", "blessing", "encouraging", "awakening", "passionate", "refusing", and "clean" can be used, and the various scents of the fragrance set are assigned to these expressions and stored in the library.
  • For the evaluation, a method such as a three-level or five-level rating or a visual analog scale (VAS: Visual Analog Scale) can be used.
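  • One way to hold such a library is a per-user mapping from impression labels to scents with their ratings, as sketched below. The impression labels and the mint example follow the text; the other concrete scents and the five-level ratings are assumptions for illustration.

        # Hypothetical per-user scent library: impression label -> scent and rating.
        # Ratings use a five-level evaluation; a VAS value in [0, 100] would also work.
        scent_library = {
            "user_u1": {
                "calming":   {"scent": "lavender", "rating": 5},  # assumed scent
                "exciting":  {"scent": "citrus",   "rating": 4},  # assumed scent
                "blessing":  {"scent": "rose",     "rating": 5},  # assumed scent
                "awakening": {"scent": "mint",     "rating": 4},  # mint is named in the text
            }
        }

        def scent_for(user_id: str, impression: str) -> str:
            """Return the scent assigned to an impression for this user."""
            return scent_library[user_id][impression]["scent"]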
  • If it is determined in the determination process of step S105 that the similar person is not looking in the direction of the user, the process proceeds to step S107.
  • In step S107, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that excites the user.
  • In this way, the control unit 100 controls the discharge of the fragrance based on behavior information, obtained from the sensor data, regarding the behavior of the target (a person similar to a person the user likes) toward the user. Further, the control unit 100 changes the type of fragrance depending on whether or not the target is looking in the direction of the user.
  • When the process of step S106 or S107 is completed, the process proceeds to step S108 of FIG. 7.
  • In step S108, the situation analysis unit 154 determines whether the distance to the similar person has come within 3 m, based on the determination result data of the determination model, the captured image data, and the like.
  • In step S108, the process waits until the distance to the similar person comes within 3 m, and then proceeds to step S109.
  • In step S109, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to intensify the scent being discharged.
  • FIG. 8 shows an example of the positional relationship between a user wearing the terminal device 10 as a spectacle-type device (hereinafter also referred to as user U1) and a person similar to a person the user likes (hereinafter also referred to as target T1).
  • In FIG. 8, the target T1 has entered the circular region A, a region within a radius of 3 m centered on the position of the user U1.
  • In this case, when the target T1 is looking at the user U1, the discharge amount of the calming scent that has been discharged is increased, and the scent becomes stronger.
  • When the target T1 is not looking at the user U1, the discharge amount of the exciting scent that has been discharged is increased, and the scent becomes stronger.
  • By increasing the discharge amount of the scent in this way, the user U1 can be supported with a more emphasized scent.
  • That is, the control unit 100 controls the discharge of the fragrance based on distance information, obtained from the sensor data, regarding the distance between the user and the target. Further, the control unit 100 increases the discharge amount of the fragrance when the value indicated by the distance information is less than a predetermined threshold value.
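  • A minimal sketch of this distance-based control, using the 3 m threshold from steps S108 and S109; the base and boost amounts are assumptions for illustration.

        # Hypothetical distance-based discharge control (threshold from steps S108/S109).
        APPROACH_THRESHOLD_M = 3.0

        def discharge_amount(distance_m: float, base: float = 1.0, boost: float = 2.0) -> float:
            """Increase the discharge amount once the target comes within the threshold."""
            return base * boost if distance_m < APPROACH_THRESHOLD_M else base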
  • When the process of step S109 is completed, the process proceeds to step S110.
  • In step S110, the emotion data acquisition unit 153 acquires emotion data of the person similar to a person the user likes.
  • The emotion data is data related to the emotions of the target T1 (FIG. 8) toward the user U1 (FIG. 8).
  • For example, the emotion data includes sensor data detected by a wearable terminal worn by the target T1 (for example, data on pulse, brain waves, sweating, body temperature, iris, voice pitch, breathing, and the like) and sensor data detected by the terminal device 10 worn by the user U1 (for example, captured image data obtained by photographing the target T1).
  • In step S111, the situation analysis unit 154 analyzes the emotions of the similar person by estimating those emotions based on the acquired emotion data.
  • The emotion analysis process for the person similar to a person the user likes may be executed by the server 20 on the cloud.
  • In this case, the wearable terminal worn by the target T1 transmits emotion data including the sensor data it has detected to the server 20 via the network 50. Further, the terminal device 10 worn by the user U1 transmits emotion data including the sensor data it has detected to the server 20 via the network 50.
  • The server 20 analyzes the emotions of the target T1 by estimating those emotions based on the emotion data transmitted from the wearable terminal and the terminal device 10.
  • The server 20 transmits the analysis result data to the terminal device 10 via the network 50.
  • In this way, the terminal device 10 can acquire the data of the emotion analysis result of the target T1 transmitted from the server 20.
  • In step S112, the situation analysis unit 154 determines whether the reaction of the similar person is a good reaction, based on the data of the emotion analysis result.
  • If it is determined in the determination process of step S112 that the reaction of the similar person is a good reaction (positive emotion), the process proceeds to step S113.
  • In step S113, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent of blessing.
  • If it is determined in the determination process of step S112 that the reaction of the similar person is not a good reaction (negative emotion), the process proceeds to step S114.
  • In step S114, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent of encouragement.
  • In this way, the control unit 100 controls the discharge of the fragrance based on the emotion data regarding the emotions of the surrounding person who is the target.
  • Further, the control unit 100 changes the type of fragrance according to the target person's compatibility with the user.
  • When the process of step S113 or S114 is completed, the process ends.
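  • Taken together, steps S105 to S114 amount to a small decision table: the target's gaze selects a calming or an exciting scent, and the target's estimated reaction selects a scent of blessing or of encouragement. The sketch below states that table directly; the string labels are assumptions for illustration.

        # Decision logic of steps S105-S114 as a sketch; the reaction is the
        # output of the emotion analysis of step S111 ("positive" / "negative"),
        # or None before any emotion data is available.
        def choose_scent(target_looking_at_user: bool, reaction=None) -> str:
            if reaction is None:
                # steps S105-S107: select by the direction the target is facing
                return "calming" if target_looking_at_user else "exciting"
            # steps S112-S114: select by the analyzed reaction
            return "blessing" if reaction == "positive" else "encouragement"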
  • As described above, the control unit 100 of the terminal device 10, such as a spectacle-type device, uses a determination model generated by machine learning using personal data for each user and controls the discharge of a fragrance to support the user with respect to a target estimated from the input sensor data.
  • As a result, the scent supports the user's feelings, making it easier for the user to approach a person who suits his or her taste.
  • FIG. 9 is a flowchart illustrating a first example of the learning process executed by the control unit 200 of the server 20.
  • This learning process is executed prior to the above-mentioned fragrance support process (FIGS. 6 and 7), and the trained determination model generated here is provided to the control unit 100 of the terminal device 10.
  • In step S201, the learning data acquisition unit 251 acquires personal data such as image data of persons the user likes.
  • For example, the image data of persons the user likes can be selected from stock photographs, libraries synthesized using artificial intelligence (AI), and the like.
  • Alternatively, the user may operate a device to add images of favorite persons (for example, celebrities) (or images of disliked persons) to the library.
  • The personal data can also include general information about the individual user, such as biometric data, social data, life logs, and schedules.
  • In step S202, the learning DB 252 accumulates the acquired personal data, such as image data, as learning data.
  • In step S203, the learning unit 253 performs a learning process using the learning data accumulated in the learning DB 252 and generates a determination model.
  • As the machine learning method used in this learning process, for example, a neural network (NN), deep learning, a support vector machine (SVM), a self-organizing map (SOM), or k-means clustering can be used.
  • For example, as the learning process, the learning unit 253 performs pattern extraction by image recognition using an extraction and determination algorithm such as the above-mentioned deep learning.
  • In step S204, the providing unit 254 provides the generated trained determination model to the control unit 100 of the terminal device 10.
  • When the process of step S204 is completed, the process ends.
  • In this way, a determination model required for the control unit 100 to execute the fragrance support process (FIGS. 6 and 7) is generated and provided to the terminal device 10.
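  • The learning process itself is only outlined above (any of NN, deep learning, SVM, SOM, or k-means clustering may be used). As one concrete instance, the sketch below trains an SVM on image feature vectors labeled as liked or not liked; the feature extraction and the use of scikit-learn are assumptions for illustration, not part of the publication.

        # Hypothetical instance of the learning process of steps S201-S204,
        # using an SVM (one of the methods named above) via scikit-learn.
        import numpy as np
        from sklearn.svm import SVC

        def extract_features(image) -> np.ndarray:
            # Stand-in for image feature extraction (e.g. a face-embedding network);
            # here we simply flatten the pixel array.
            return np.asarray(image, dtype=float).ravel()

        def train_determination_model(liked_images, other_images) -> SVC:
            X = np.array([extract_features(im) for im in liked_images + other_images])
            y = np.array([1] * len(liked_images) + [0] * len(other_images))
            model = SVC(probability=True)  # probability gives a similarity-like score
            return model.fit(X, y)

        # The server 20 would then provide the trained model to the terminal device 10,
        # where predict_proba(...) yields the similarity used in step S103.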
  • FIG. 10 is a flowchart illustrating an example of the scent customization process executed by the control unit 200 of the server 20.
  • In step S231, the learning data acquisition unit 251 acquires the emotion data transmitted from the wearable terminal and the terminal device 10. This emotion data is stored in the learning DB 252 as learning data.
  • In step S232, the learning unit 253 performs a learning process using the emotion data accumulated in the learning DB 252 and customizes the scent to be discharged the next time the user meets a favorite person.
  • In this learning process, learning is performed using the emotion data together with the data of each user's library of scents and impressions. Then, for example, when emotion data regarding the emotions (positive or negative) of the target T1 (FIG. 8) toward the user U1 (FIG. 8) is input to the trained determination model, information on the customized fragrance is output.
  • In step S233, the providing unit 254 provides the information on the customized scent to the control unit 100 of the terminal device 10.
  • As a result, based on the customized scent information, the scent discharged according to the direction in which the person similar to a person the user likes is facing (the scent discharged by the processes of steps S106 and S107 in FIG. 6) can be changed to a more appropriate scent.
  • In this way, the control unit 100 of the terminal device 10 feeds back the emotion data and controls the discharge of a fragrance customized according to the fed-back emotion data.
  • When the process of step S233 is completed, the process ends.
  • The scent customization process has been described above.
  • In this process, the scent discharged by the terminal device 10 is changed according to the reaction (positive or negative emotion) of the other party up to the previous encounter, so that the user's feelings can be supported with a more suitable scent.
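  • A minimal sketch of this feedback loop, under the assumption that reactions are logged per scent and the best-performing scent is chosen the next time; the update rule is an illustrative assumption.

        # Hypothetical scent customization from fed-back emotion data:
        # keep a running success rate per scent and pick the best one next time.
        from collections import defaultdict

        reaction_log = defaultdict(lambda: {"positive": 0, "total": 0})

        def record_reaction(scent: str, positive: bool):
            reaction_log[scent]["total"] += 1
            reaction_log[scent]["positive"] += int(positive)

        def customized_scent(candidates):
            """Pick the candidate scent with the best observed reaction rate."""
            def rate(s):
                log = reaction_log[s]
                return log["positive"] / log["total"] if log["total"] else 0.5
            return max(candidates, key=rate)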
  • In this example as well, the terminal device 10 worn by the user is provided, from the server 20, with a determination model trained using personal data such as image data of persons the user likes, and the model is made available to the control unit 100.
  • If it is determined in the determination process of step S135 that the similar person is looking in the direction of the user, the process proceeds to step S136.
  • In step S136, the fragrance discharge unit 123 is controlled by the fragrance discharge control unit 155, as in the process of step S106, and a scent that calms the user is discharged.
  • If it is determined in the determination process of step S135 that the similar person is not looking in the direction of the user, the process proceeds to step S137.
  • In step S137, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that excites the user.
  • In addition, in this case, the appearance of the user, such as clothes, hairstyle, makeup, and eye size and color, is changed according to the preferences of the target partner (a person similar to a person the user likes).
  • For example, when the target partner also wears the terminal device 10 as a spectacle-type device like the user, information such as clothes and hairstyle may be drawn on the spectacle-type device worn by the target partner using AR technology, so that the appearance of the user viewed through the transmissive display is changed.
  • Alternatively, contact lenses can be used to change the color of the user's eyes, or underwear that can change the shape of a part of the user's body can be used to emphasize that part of the body; in this way, the appearance of the user in real space may be changed.
  • Information about the preferences of the target partner can be estimated using emotion data transmitted from a wearable terminal worn by that person, or using information about that person published on the Internet (for example, social media information).
  • Further, the server 20 may manage the information on the preferences of the target partner in a database, or may include the information on the preferences of the target partner in the learning data when performing the learning process.
  • In this way, the control unit 100 of the terminal device 10 worn by the target partner changes the appearance of the user as seen by the target so as to attract the target's attention when the target is not looking in the direction of the user.
  • When the process of step S136 or S137 is completed, the subsequent processes are executed; since they are the same as the processes of steps S108 to S114 of FIG. 7, their description is omitted.
  • In the next example, the terminal device 10 worn by the user is provided, from the server 20, with a determination model trained using personal data such as image data of persons the user is not comfortable with, and the model is made available to the control unit 100.
  • In step S161, the sensor data acquisition unit 151 acquires captured image data captured in real time by the camera unit 112.
  • In step S162, the determination model application unit 152 applies the trained determination model to the acquired captured image data.
  • In step S163, the situation analysis unit 154 determines, based on the determination result data of the determination model, the captured image data, and the like, whether a person the user is not comfortable with has been detected among the surrounding people within a radius of 10 m centered on the user.
  • If it is determined in the determination process of step S163 that no such person has been detected, the process returns to step S161, and the processes of steps S161 to S163 are repeated.
  • In step S164, the situation analysis unit 154 analyzes the direction in which the person the user is not comfortable with is facing, based on the determination result data of the determination model and the captured image data.
  • In step S165, the situation analysis unit 154 determines whether the person the user is not comfortable with is looking in the direction of the user, based on the analysis result of the direction in which that person is facing.
  • If it is determined in the determination process of step S165 that the person is looking in the direction of the user, the process proceeds to step S166.
  • In step S166, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that calms the user.
  • If it is determined in the determination process of step S165 that the person is not looking in the direction of the user, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge an awakening scent that encourages quick action.
  • In this way, when a person the user is not comfortable with approaches, an awakening scent that encourages quick action is discharged, so that the user is prompted by the scent to move quickly and can immediately move to a place away from that person. For example, as the awakening scent, the scent of mint or the like can be discharged based on the information stored in the library.
  • In step S168, the situation analysis unit 154 determines whether the distance to the person the user is not comfortable with has come within 3 m, based on the determination result data of the determination model, the captured image data, and the like.
  • In step S168, the process waits until the distance to that person comes within 3 m, and then proceeds to step S169.
  • In step S169, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to intensify the scent being discharged.
  • When the process of step S169 is completed, the process ends.
  • FIG. 13 is a flowchart illustrating a second example of the learning process executed by the control unit 200 of the server 20.
  • This learning process is executed prior to the above-mentioned fragrance support process (FIG. 12), and the trained determination model generated here is provided to the control unit 100 of the terminal device 10.
  • In step S261, the learning data acquisition unit 251 acquires personal data such as image data of persons the user is not comfortable with.
  • In step S262, the learning unit 253 performs learning using the acquired personal data, such as image data, and generates a determination model.
  • In step S263, the providing unit 254 provides the generated trained determination model to the control unit 100 of the terminal device 10.
  • When the process of step S263 is completed, the process ends.
  • FIG. 14 shows another example of the functional configuration of the control unit 200 included in the server 20.
  • In this configuration, the control unit 200 has a target data acquisition unit 261, a matching DB 262, a matching processing unit 263, and a matching result providing unit 264.
  • The target data acquisition unit 261 controls the communication unit 208 to acquire the target data transmitted from the terminal device 10 and supplies it to the matching processing unit 263.
  • This target data includes data about a person similar to a person the user likes.
  • The matching DB 262 stores the data required for the matching process. This necessary data includes, for example, data for analyzing compatibility for each user.
  • The matching processing unit 263 refers to the data stored in the matching DB 262 and performs a matching process using the target data supplied from the target data acquisition unit 261.
  • The matching processing unit 263 supplies the matching process result data to the matching result providing unit 264.
  • The matching result providing unit 264 controls the communication unit 208 to provide the matching process result data supplied from the matching processing unit 263 to the terminal device 10.
  • In this matching service process, a process of matching users wearing the terminal device 10 with their favorite persons is performed.
  • FIGS. 15 and 16 show the processes executed by the terminal device 10-1 worn by the user and by the terminal devices 10-2 and 10-3 worn by the user's favorite persons A and B, respectively.
  • Each of the terminal devices 10-1 to 10-3 is provided with a determination model trained using personal data, such as image data of persons that the user of that terminal device 10 likes, and the model is made available to the control unit 100 of each terminal device 10.
  • In the terminal device 10-1, by applying the trained determination model to captured image data captured in real time (S301, S302), it is determined whether a person similar to a person the user likes has been detected among the people around the user (S303). Then, when the terminal device 10-1 detects a person similar to a person the user likes, data about the similar person is transmitted to the server 20 as target data (S304).
  • Similarly, in the terminal device 10-2, it is determined whether a person similar to a person that the favorite person A likes has been detected among the people around the favorite person A (S313). Then, when such a similar person is detected in the terminal device 10-2, data about the similar person is transmitted to the server 20 as target data (S314).
  • Likewise, in the terminal device 10-3, it is determined whether a person similar to a person that the favorite person B likes has been detected among the people around the favorite person B (S323). Then, when such a similar person is detected in the terminal device 10-3, data about the similar person is transmitted to the server 20 as target data (S324).
  • The server 20, which receives the target data transmitted from the terminal devices 10-1 to 10-3, executes the processes of steps S331 to S333 of FIG. 16.
  • In step S331, the target data acquisition unit 261 acquires the target data transmitted from the terminal devices 10-1 to 10-3.
  • In step S332, the matching processing unit 263 refers to the data stored in the matching DB 262 and performs a matching process using the acquired target data.
  • In this matching process, a ranking of the candidates is calculated based on information such as the distance between the users at that time and the compatibility of each user, and the person with the highest calculated ranking is determined as the matching partner.
  • The compatibility can include, for example, the degree of mutual preference based on the images, the degree of agreement of tastes, and the degree of agreement (or disagreement) of DNA information.
  • That is, the user with the highest degree of matching can be determined as the matching partner based on the compatibility and the distance between the matching users at that time.
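  • Read literally, steps S331 to S333 rank candidates by combining compatibility with the current distance and pick the highest-scoring candidate. The sketch below is one assumed form of that ranking; the equal weighting and the distance normalization are not specified in the publication.

        # Hypothetical matching ranking (steps S331-S333).
        def matching_score(compatibility: float, distance_m: float, max_range_m: float = 10.0) -> float:
            """compatibility in [0, 1]; nearer candidates score higher."""
            proximity = max(0.0, 1.0 - distance_m / max_range_m)
            return 0.5 * compatibility + 0.5 * proximity

        def match_partner(candidates):
            # candidates: dicts like {"id": ..., "compatibility": 0.8, "distance_m": 4.2}
            return max(candidates, key=lambda c: matching_score(c["compatibility"], c["distance_m"]))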
  • In step S333, the matching result providing unit 264 transmits the matching result data obtained in the matching process to the terminal devices 10 according to the matching result. For example, when the user and the favorite person A are matched by the matching process, the matching result data is transmitted to the terminal device 10-1 and the terminal device 10-2 used by the matched persons, respectively.
  • The processes of steps S305 to S308 of FIG. 16 are executed in each of the terminal device 10-1 and the terminal device 10-2 that receive the matching result data transmitted from the server 20.
  • In the terminal device 10-1 and the terminal device 10-2, the fragrance discharge unit 123 is controlled by each fragrance discharge control unit 155, so that a scent of blessing is discharged (S305).
  • The scent of blessing is controlled by each fragrance discharge control unit 155 so that it becomes stronger as the distance between the user and the favorite person A becomes shorter (S305).
  • In the terminal device 10-1 and the terminal device 10-2, each situation analysis unit 154 determines, based on data such as emotion data, whether either the user or the favorite person A has a sense of discomfort (S306).
  • The process of determining whether there is a sense of discomfort may be executed by the server 20, with the server 20 acquiring the data such as emotion data.
  • In this case, the terminal device 10-1 and the terminal device 10-2 receive the data of the determination result regarding the sense of discomfort transmitted from the server 20.
  • If it is determined in the determination process of step S306 that neither of the two persons has a sense of discomfort, the process proceeds to step S307. Then, in the terminal device 10-1 and the terminal device 10-2, when the distance between the two people comes within a radius of 1 m, each fragrance discharge control unit 155 controls each fragrance discharge unit 123 so that a passionate scent is discharged (S307).
  • The detection of a person within a certain distance can be performed using the captured image data; alternatively, for example, the position data (for example, GPS position data) acquired by each of the terminal device 10-1 and the terminal device 10-2 may be collated, or radio waves may be captured by short-range wireless communication such as Bluetooth (registered trademark).
  • If it is determined in the determination process of step S306 that one of the two persons has a sense of discomfort, the process proceeds to step S308. Then, in the terminal device 10-1 and the terminal device 10-2, the fragrance discharge unit 123 is controlled by each fragrance discharge control unit 155, so that a scent of refusal is discharged (S308).
  • In this way, when the feelings of the two people match, their feelings can be positively supported by the scent of blessing and the scent of passion.
  • On the other hand, when the feelings of the two people do not match, the scent of refusal can support the feelings of at least the person who has the sense of discomfort.
  • In this way, the control unit 200 acquires the target data related to the target transmitted from the terminal device 10 of each user and performs matching according to the compatibility of each user based on the target data for each user.
  • Further, the control unit 100 controls the discharge of the fragrance based on distance information, obtained from the sensor data of the terminal devices 10 of the matched users, regarding the distance between the matched users. The control unit 100 increases the discharge amount of a predetermined type of fragrance as the distance between the matched users becomes shorter, and discharges a predetermined type of fragrance when the value indicated by the distance information is less than a predetermined threshold value.
  • Further, the control unit 100 discharges a predetermined type of fragrance when at least one of the matched users has a sense of discomfort, based on the emotion data regarding the emotions of the matched users toward each other.
  • When the process of step S307 or S308 is completed, the process ends.
  • FIG. 17 shows another example of the configuration of an embodiment of the fragrance support system to which the present technology is applied.
  • In the fragrance support system 1 of FIG. 17, an information device 30 and a fragrance discharge device 40 are provided in addition to the terminal device 10 and the server 20.
  • The terminal device 10, the information device 30, and the fragrance discharge device 40 are possessed by one user.
  • The information device 30 is configured as, for example, a smartphone, a tablet terminal, a portable music player, a game machine, or the like.
  • The information device 30 has a communication function, can be connected to the network 50, and can also communicate with the terminal device 10 by short-range wireless communication such as Bluetooth (registered trademark).
  • The fragrance discharge device 40 has a stick-like shape such as a round bar or a square bar, and has the same function as the fragrance discharge unit 123 (FIG. 2). In this configuration, therefore, the fragrance discharge unit 123 is removed from the configuration of the terminal device 10 shown in FIG. 2.
  • The fragrance discharge device 40 has a fragrance holding structure filled with a liquid fragrance; air is passed through a holding space provided in the fragrance holding structure, and the liquid fragrance held by a fragrance holding body arranged in that space is vaporized and released.
  • The fragrance discharge device 40 also has a communication function and can communicate with the terminal device 10 by short-range wireless communication such as Bluetooth (registered trademark).
  • The terminal device 10 exchanges data with the server 20 on the network 50 via the information device 30 to execute the above-mentioned fragrance support process (FIGS. 6, 7, and the like). Further, when the terminal device 10 discharges various scents while executing the fragrance support process (S106 and S107 in FIG. 6, S113 and S114 in FIG. 7, and the like), the fragrance discharge control unit 155 (FIG. 3) controls the fragrance discharge device 40 to discharge the scent.
  • In this way, the device that executes the fragrance support process and the device that discharges the fragrance do not necessarily have to be the same device.
  • Further, although FIG. 17 shows a configuration in which the information device 30 such as a smartphone is provided, the terminal device 10 may directly communicate with the server 20 on the network 50 without the information device 30.
  • In the above description, the control unit 100 of FIG. 3 is included in the terminal device 10 and the control unit 200 of FIG. 5 is included in the server 20; however, all or part of the functions of the control unit 100 of FIG. 3 may be included in the server 20, and all or part of the functions of the control unit 200 of FIG. 5 may be included in the terminal device 10.
  • For example, the determination model application unit 152 and the emotion data acquisition unit 153 of FIG. 3 may be provided in the server 20 (control unit 200).
  • In this case, the terminal device 10 transmits the data to which the determination model is to be applied to the server 20 via the network 50, and receives the data of the determination result of the trained determination model from the server 20.
  • Further, the scent customization process shown in FIG. 10 may be executed by the control unit 100 of the terminal device 10 worn by each user. That is, each process in the above-mentioned flowcharts is executed by the control unit 100 of the terminal device 10, by the control unit 200 of the server 20, or by the control unit 100 and the control unit 200 in cooperation.
  • The series of processes of the terminal device 10 or the server 20 described above can be executed by hardware or by software.
  • When the series of processes is executed by software, the programs constituting the software are installed on the computer of each device.
  • In the computer, the CPU loads the program recorded in the ROM or the storage unit into the RAM and executes it, whereby the above-mentioned series of processes is performed.
  • The program executed by the computer can be recorded and provided on a removable recording medium such as a package medium.
  • The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • The program can be installed in the storage unit by mounting the removable recording medium on the drive. Further, the program can be received by the communication unit via a wired or wireless transmission medium and installed in the storage unit. In addition, the program can be installed in advance in the ROM or the storage unit.
  • The processing performed by the computer according to the program does not necessarily have to be performed in chronological order in the order described in the flowcharts. That is, the processing performed by the computer according to the program also includes processes executed in parallel or individually (for example, parallel processing or processing by objects).
  • The program may be processed by one computer (processor) or may be processed in a distributed manner by a plurality of computers.
  • Further, the program may be transferred to a distant computer and executed there.
  • In the present specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • The embodiments of the present technology are not limited to the above-described embodiments, and various changes can be made without departing from the gist of the present technology.
  • For example, the present technology can have a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
  • Each step described in the above flowcharts can be executed by one device or shared among a plurality of devices. Further, when one step includes a plurality of processes, the plurality of processes included in that step can be executed by one device or shared among a plurality of devices.
  • An information processing device including a control unit that controls the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data, using a determination model generated by machine learning using personal data for each user.
  • The information processing device according to (1) above, wherein the personal data includes data on the compatibility of the user with others, the sensor data includes captured image data of the user's surroundings, and the control unit uses the determination model to control the discharge of the fragrance for the user who acts with respect to the surrounding people according to the compatibility, as estimated from the input captured image data.
  • The information processing device according to (2) above, wherein the control unit acquires target data related to the target transmitted from each user's terminal and performs matching according to the compatibility of each user based on the target data for each user.
  • The information processing device according to the above, wherein the control unit controls the discharge of the fragrance based on distance information regarding the distance between the matched users obtained from the sensor data of the terminals of the matched users.
  • The information processing device according to (13) above, wherein the control unit increases the discharge amount of a predetermined type of fragrance as the distance between the matched users becomes shorter.
  • The information processing device according to any one of (12) to (15) above, wherein the control unit discharges a predetermined type of fragrance when at least one of the matched users has a sense of discomfort, based on emotion data regarding the emotions of the matched users toward each other.
  • The information processing device according to any one of the above, wherein the control unit determines the user with the highest degree of matching as the matching partner based on the compatibility and the distance between the matching users at that time.
  • The information processing device according to any one of (1) to (11) above, which is configured as a wearable terminal having a device for detecting the sensor data.
  • An information processing method in which an information processing device uses a determination model generated by machine learning using personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
  • 1 fragrance support system, 10, 10-1 to 10-N terminal device, 20 server, 50 network, 100 control unit, 101 CPU, 102 ROM, 103 RAM, 105 input unit, 106 output unit, 107 storage unit, 108 communication unit, 109 short-range wireless communication unit, 111 operation unit, 112 camera unit, 113 sensor unit, 121 display unit, 122 sound output unit, 123 fragrance discharge unit, 151 sensor data acquisition unit, 152 determination model application unit, 153 emotion data acquisition unit, 154 situation analysis unit, 155 fragrance discharge control unit, 200 control unit, 201 CPU, 202 ROM, 203 RAM, 205 input unit, 206 output unit, 207 storage unit, 208 communication unit, 209 drive, 251 learning data acquisition unit, 252 learning DB, 253 learning unit, 254 providing unit, 261 target data acquisition unit, 262 matching DB, 263 matching processing unit, 264 matching result providing unit

Abstract

The present invention pertains to an information processing device and an information processing method which enable users to be assisted by means of fragrance. Provided is an information processing device equipped with a control unit which, using a determination model generated by machine learning with personal data of each user, controls the discharge of a fragrance for assisting a user with respect to a target inferred from input sensor data. The present invention is applicable to, for example, a fragrance support system for supporting a user's behavior by means of scent.

Description

Information processing device and information processing method
The present technology relates to an information processing device and an information processing method, and more particularly to an information processing device and an information processing method capable of supporting a user by fragrance.
In recent years, attention has been drawn not only to user experiences through sight and hearing, such as images and sounds, but also to devices that generate scents and appeal to the user's sense of smell (see, for example, Patent Document 1).
Patent Document 1: International Publication No. 2018/016153
The sense of smell is directly linked to a user's emotions, and a technique for supporting the user's behavior by scent has therefore been sought.
The present technology was made in view of such a situation, and makes it possible to support the user by scent.
An information processing device according to one aspect of the present technology includes a control unit that, using a judgment model generated by machine learning with personal data for each user, controls the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
In an information processing method according to one aspect of the present technology, an information processing device uses a judgment model generated by machine learning with personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
In the information processing device and the information processing method according to one aspect of the present technology, a judgment model generated by machine learning with personal data for each user is used, and the discharge of a fragrance for supporting the user with respect to the target estimated from the input sensor data is controlled.
The information processing device according to one aspect of the present technology may be an independent device or an internal block constituting a single device.
Fig. 1 is a diagram showing an example of the configuration of an embodiment of a fragrance support system to which the present technology is applied.
Fig. 2 is a diagram showing an example of the configuration of a terminal device.
Fig. 3 is a diagram showing an example of the functional configuration of a terminal device.
Fig. 4 is a diagram showing an example of the configuration of a server.
Fig. 5 is a diagram showing an example of the functional configuration of a server.
Fig. 6 is a flowchart explaining a first example of the fragrance support processing of a terminal device.
Fig. 7 is a flowchart explaining the first example of the fragrance support processing of a terminal device.
Fig. 8 is a diagram showing an example of the positional relationship between a user and a similar person.
Fig. 9 is a flowchart explaining a first example of the learning processing of a server.
Fig. 10 is a flowchart explaining an example of the scent customization processing of a server.
Fig. 11 is a flowchart explaining a second example of the fragrance support processing of a terminal device.
Fig. 12 is a flowchart explaining a third example of the fragrance support processing of a terminal device.
Fig. 13 is a flowchart explaining a second example of the learning processing of a server.
Fig. 14 is a diagram showing another example of the functional configuration of a server.
Fig. 15 is a flowchart explaining an example of matching service processing between a plurality of terminal devices and a server.
Fig. 16 is a flowchart explaining the example of matching service processing between a plurality of terminal devices and a server.
Fig. 17 is a diagram showing another example of the configuration of an embodiment of a fragrance support system to which the present technology is applied.
<1. First Embodiment>
(Configuration of the fragrance support system)
Fig. 1 shows an example of the configuration of an embodiment of a fragrance support system to which the present technology is applied.
The fragrance support system 1 is a system that supports the user by scent. The fragrance support system 1 is composed of terminal devices 10-1 to 10-N (N: an integer of 1 or more) and a server 20, which are connected to each other via a network 50.
The terminal device 10-1 is configured as a wearable terminal such as a spectacle-type device having transmissive displays.
In this spectacle-type device, a pair of transmissive displays for the left eye and the right eye are arranged at the positions where lenses are attached to the frame of ordinary spectacles, and the device is worn around the user's eyes. The spectacle-type device can therefore draw various kinds of information into the user's field of view by using AR (Augmented Reality) technology or the like, without separating the user's view from the real space.
The terminal device 10-1 performs determination processing using a trained judgment model provided from the server 20 via the network 50 together with sensor data acquired by its camera unit and sensor unit, and controls the discharge of a fragrance for supporting the user based on the determination result.
The terminal device 10-1 is not limited to a spectacle-type device, and may be, for example, a wearable terminal in the shape of a pin badge, a contact lens, an artificial beauty mark, or the like. The wearable terminal may also include a device attached to a material such as human skin or clothing, for example an electronic tattoo.
Like the terminal device 10-1, the terminal devices 10-2 to 10-N are each configured as a wearable terminal such as a spectacle-type device. In the following description, the terminal devices 10-1 to 10-N are referred to as the terminal device 10 when they do not need to be distinguished.
The server 20 is composed of one or more servers that provide various services to the terminal device 10 via the network 50.
For example, the server 20 performs learning processing using learning data to generate a judgment model and provides it to the terminal device 10. Also, for example, the server 20 performs matching processing based on data transmitted from the terminal devices 10 and provides the matching result to the terminal devices 10.
The network 50 includes communication networks such as the Internet, mobile phone networks, and intranets. That is, the fragrance support system 1 of Fig. 1 uses, for its communication, TCP/IP (Transmission Control Protocol/Internet Protocol) communication via the Internet, cellular mobile communication such as LTE-Advanced or 5G, and wireless LAN (Local Area Network) communication via access points.
(Configuration of the terminal device)
Fig. 2 shows an example of the configuration of the terminal device 10 of Fig. 1.
As shown in Fig. 2, in the terminal device 10, a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, and a RAM (Random Access Memory) 103 are connected to one another by a bus 104.
The CPU 101 controls the operation of each part of the terminal device 10 by executing a program recorded in the ROM 102. Various kinds of data are recorded in the RAM 103 as appropriate.
An input/output interface 110 is also connected to the bus 104. An input unit 105, an output unit 106, a storage unit 107, a communication unit 108, and a short-range wireless communication unit 109 are connected to the input/output interface 110.
The input unit 105 supplies various kinds of input data to each unit including the CPU 101 via the input/output interface 110. For example, the input unit 105 has an operation unit 111, a camera unit 112, and a sensor unit 113.
The operation unit 111 is operated by the user and supplies operation data corresponding to the operation to the CPU 101. The operation unit 111 is composed of physical buttons and the like.
The camera unit 112 photoelectrically converts light incident from a subject and performs predetermined signal processing on the resulting electrical signal, thereby generating captured image data and outputting it as sensor data. The camera unit 112 is composed of an image sensor, a signal processing unit, and the like.
The sensor unit 113 senses spatial information, time information, and the like, and outputs the sensor data obtained as a result of the sensing.
The sensor unit 113 includes an inertial measurement unit (IMU) that measures three-dimensional angular velocity and acceleration. The IMU can obtain three-dimensional angular velocity and acceleration with a three-axis gyroscope and a three-direction accelerometer.
The sensor unit 113 can also include various other sensors, such as a biometric sensor that measures information such as heart rate, body temperature, or posture, a proximity sensor that detects nearby objects, and a magnetic sensor that measures the magnitude and direction of a magnetic field.
The output unit 106 outputs various kinds of information and substances (fragrances and the like) under the control of the CPU 101 via the input/output interface 110. For example, the output unit 106 has a display unit 121, a sound output unit 122, and a fragrance discharge unit 123.
The display unit 121 displays images and the like corresponding to image data under the control of the CPU 101. For example, when the terminal device 10 is a spectacle-type device, the display unit 121 is configured as a pair of transmissive displays for the left eye and the right eye.
The sound output unit 122 outputs sound corresponding to sound data under the control of the CPU 101. For example, when the terminal device 10 is a spectacle-type device, the sound output unit 122 is configured as small headphones arranged close to the user's left and right ears.
The fragrance discharge unit 123 discharges various fragrances prepared in advance under the control of the CPU 101.
For example, the fragrance discharge unit 123 has a fragrance holding structure filled with a liquid fragrance, and releases the fragrance by flowing air through a holding space provided in the fragrance holding structure to vaporize the liquid fragrance held by the fragrance holder arranged in that holding space.
Liquid fragrances include various liquids that can generate a scent when vaporized. For example, the liquid fragrance may include a perfume, a liquid seasoning, or the like. When a plurality of types of liquid fragrance are used, the fragrance holding structure may be provided with a plurality of holding spaces, and the liquid fragrance held in a holding space selected from among them is vaporized and released.
The fragrance discharge unit 123 may be built into the terminal device 10, or may be provided in a housing outside the terminal device 10 and controlled from the terminal device 10.
The storage unit 107 records various kinds of data and programs under the control of the CPU 101. The CPU 101 reads and processes various kinds of data from the storage unit 107 and executes programs.
The storage unit 107 is configured as an auxiliary storage device such as a semiconductor memory. The storage unit 107 may be configured as internal storage or may be external storage such as a memory card.
The communication unit 108 communicates with other devices via the network 50 under the control of the CPU 101. The communication unit 108 is configured as a communication module supporting cellular communication (for example, LTE-Advanced or 5G), wireless communication such as wireless LAN, or wired communication.
The short-range wireless communication unit 109 performs wireless communication based on short-range wireless communication standards such as Bluetooth (registered trademark) and NFC (Near Field Communication) to exchange various kinds of data.
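As one way to picture the discharge mechanism just described, the following is a minimal, purely illustrative Python sketch of a discharge unit that selects one of several fragrance holding spaces and vaporizes its liquid fragrance; the class, the slot layout, and the hardware interface are assumptions, not part of the patent.

```python
# Illustrative sketch only: a software stand-in for the fragrance
# discharge unit 123. The slot mapping and airflow interface are
# invented for this example; the patent specifies no concrete API.

class FragranceDischargeUnit:
    def __init__(self, cartridges):
        # Map scent name to the holding space (slot) storing its
        # liquid fragrance, e.g. {"calming": 0, "exciting": 1}.
        self.cartridges = cartridges

    def discharge(self, scent: str, amount: float) -> None:
        slot = self.cartridges[scent]
        # Flowing air through the selected holding space vaporizes
        # the liquid fragrance held there and releases the scent.
        self._open_air_flow(slot, amount)

    def _open_air_flow(self, slot: int, amount: float) -> None:
        # Placeholder for the actual hardware control.
        print(f"air flow -> holding space {slot}, amount {amount}")
```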
Fig. 3 shows an example of the functional configuration of the control unit 100 included in the terminal device 10. The functions of the control unit 100 are realized by the CPU 101 executing a program.
In Fig. 3, the control unit 100 has a sensor data acquisition unit 151, a judgment model application unit 152, an emotion data acquisition unit 153, a situation analysis unit 154, and a fragrance discharge control unit 155.
The sensor data acquisition unit 151 acquires the sensor data detected by the camera unit 112 or the sensor unit 113 and supplies it to the judgment model application unit 152 and the situation analysis unit 154. The sensor data includes captured image data.
The judgment model application unit 152 can use the trained judgment model provided from the server 20 via the network 50. The judgment model application unit 152 applies the judgment model to the captured image data supplied from the sensor data acquisition unit 151 and supplies the resulting determination data to the situation analysis unit 154.
The emotion data acquisition unit 153 acquires emotion data about a person's emotions and supplies it to the situation analysis unit 154.
For example, this emotion data is data about the emotions of a target person toward the user, and includes data such as sensor data detected by a wearable terminal worn by the target person and sensor data detected by the terminal device 10.
The situation analysis unit 154 is supplied with the sensor data from the sensor data acquisition unit 151, the determination result data from the judgment model application unit 152, and the emotion data from the emotion data acquisition unit 153.
Based on the data supplied to it, the situation analysis unit 154 analyzes various situations (for example, the situation around the user and the emotions of the target person). The situation analysis unit 154 supplies the analysis result data to the fragrance discharge control unit 155.
Based on the analysis result data supplied from the situation analysis unit 154, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge various fragrances.
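To make the data flow among these functional blocks concrete, here is a minimal illustrative Python sketch of one processing cycle of the control unit 100; all class names, data shapes, and thresholds are assumptions made for explanation, not the patent's implementation.

```python
# Purely illustrative wiring of the functional blocks of Fig. 3:
# sensor acquisition -> judgment model -> situation analysis -> discharge.

from dataclasses import dataclass

@dataclass
class Situation:
    target_detected: bool    # a person matching the model was found
    target_is_looking: bool  # the target faces the user
    distance_m: float        # estimated distance to the target

class ControlUnit:
    def __init__(self, judgment_model, fragrance_unit):
        self.judgment_model = judgment_model  # trained model from the server
        self.fragrance_unit = fragrance_unit  # e.g. the sketch above

    def step(self, captured_image, sensor_data, emotion_data) -> None:
        # Judgment model application unit 152
        result = self.judgment_model.predict(captured_image)
        # Situation analysis unit 154
        situation = self._analyze(result, sensor_data, emotion_data)
        # Fragrance discharge control unit 155
        if situation.target_detected:
            scent = "calming" if situation.target_is_looking else "exciting"
            amount = 2.0 if situation.distance_m <= 3.0 else 1.0
            self.fragrance_unit.discharge(scent, amount)

    def _analyze(self, result, sensor_data, emotion_data) -> Situation:
        # Combine the model output with raw sensor and emotion data.
        return Situation(
            target_detected=result.get("detected", False),
            target_is_looking=result.get("looking", False),
            distance_m=sensor_data.get("distance_m", float("inf")),
        )
```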
(Configuration of the server)
Fig. 4 shows an example of the configuration of the server 20 of Fig. 1.
As shown in Fig. 4, in the server 20, a CPU 201, a ROM 202, and a RAM 203 are connected to one another by a bus 204.
The CPU 201 controls the operation of each part of the server 20 by executing a program recorded in the ROM 202. Various kinds of data are recorded in the RAM 203 as appropriate.
An input/output interface 211 is also connected to the bus 204. An input unit 205, an output unit 206, a storage unit 207, a communication unit 208, and a drive 209 are connected to the input/output interface 211.
The input unit 205 supplies various kinds of input data to each unit including the CPU 201 via the input/output interface 211. For example, the input unit 205 is composed of a keyboard, a mouse, a microphone, and the like.
The output unit 206 outputs various kinds of information under the control of the CPU 201 via the input/output interface 211. For example, the output unit 206 has a display unit 221 and a sound output unit 222.
The display unit 221 displays images and the like corresponding to image data under the control of the CPU 201. The display unit 221 is composed of a liquid crystal panel or an OLED (Organic Light Emitting Diode) panel, a signal processing unit, and the like.
The sound output unit 222 outputs sound corresponding to sound data under the control of the CPU 201. The sound output unit 222 is composed of a speaker or the like.
The storage unit 207 records various kinds of data and programs under the control of the CPU 201. The CPU 201 reads and processes various kinds of data from the storage unit 207 and executes programs.
The storage unit 207 is configured as an auxiliary storage device such as a semiconductor memory or an HDD (Hard Disk Drive). The storage unit 207 may be configured as internal storage or may be external storage.
The communication unit 208 communicates with other devices via the network 50 under the control of the CPU 201. The communication unit 208 is configured as a communication module supporting cellular communication (for example, LTE-Advanced or 5G), wireless communication such as wireless LAN, or wired communication such as Ethernet (registered trademark).
A removable recording medium such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory is mounted on the drive 209 as appropriate, and the drive 209 drives the removable recording medium under the control of the CPU 201. The CPU 201 can control the drive 209 to read data written on the mounted removable recording medium.
Fig. 5 shows an example of the functional configuration of the control unit 200 included in the server 20. The functions of the control unit 200 are realized by the CPU 201 executing a program.
In Fig. 5, the control unit 200 has a learning data acquisition unit 251, a learning DB 252, a learning unit 253, and a providing unit 254. The learning DB 252 is recorded in the storage unit 207.
The learning data acquisition unit 251 acquires personal data for each user and stores it in the learning DB 252 as learning data.
This personal data includes information about the individual user in general, such as data on the user's compatibility with other people, biometric data, social data, life logs (including behavior patterns and position information), and schedules. The data set used as personal data can include data in any format, such as text data, image data, moving image data, and audio data.
The learning unit 253 performs machine learning using the personal data stored in the learning DB 252 and generates a judgment model. The trained judgment model is supplied to the providing unit 254.
When a request or the like is made from the terminal device 10, the providing unit 254 controls the communication unit 208 to provide the trained judgment model supplied from the learning unit 253 to the terminal device 10 via the network 50.
The configuration of the fragrance support system 1 has been described above.
Next, the details of the processing executed by each device of the fragrance support system 1 will be described with reference to Figs. 6 to 16.
(Flow of the fragrance support processing)
First, a first example of the fragrance support processing executed by the control unit 100 of the terminal device 10 will be described with reference to the flowcharts of Figs. 6 and 7.
Before this fragrance support processing is executed, the terminal device 10 worn by the user is provided by the server 20 with a judgment model trained with personal data such as image data of people the user likes, and the model is made available to the control unit 100.
In step S101, the sensor data acquisition unit 151 acquires captured image data taken in real time by the camera unit 112 provided in the terminal device 10, such as a spectacle-type device. At this time, the camera unit 112 can be said to operate as a surveillance camera that monitors the user's surroundings.
In step S102, the judgment model application unit 152 applies the trained judgment model to the acquired captured image data.
In step S103, based on the determination result data of the judgment model, the captured image data, and the like, the situation analysis unit 154 determines whether a person similar to someone the user likes has been detected among the surrounding people within a radius of 10 m around the user.
When detecting a person similar to someone the user likes, the personal data used as learning data in the learning processing executed by the learning unit 253 of the control unit 200 of the server 20 may include information such as occupation, annual income, marital status, address, hobbies, age, and DNA (deoxyribonucleic acid) information, so that the trained judgment model detects similar people who also match the desired conditions set by the user.
For example, when the user has set desired conditions, the situation analysis unit 154 can compute a score value for each surrounding person within a radius of 10 m around the user by performing predetermined arithmetic processing that combines the similarity to the person the user likes with the set desired conditions, and estimate the person with the highest score value as the person similar to the one the user likes, as sketched below.
When detecting surrounding people within a predetermined range, such as within a radius of 10 m, in addition to detecting people within a certain distance from the captured image data, position data acquired by the user's terminal device 10 and the devices carried by the surrounding people (for example, position data from GPS (Global Positioning System)) may be matched against each other, or radio waves may be captured by short-range wireless communication such as Bluetooth (registered trademark).
If it is determined in step S103 that no similar person has been detected, the process returns to step S101, and steps S101 to S103 are repeated.
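The following is one hypothetical way to compute such a score; the weighting between similarity and desired conditions is an assumption, since the text above only says that "predetermined arithmetic processing" is performed.

```python
# Hypothetical scoring of nearby candidates: a weighted mix of the
# model's similarity score and the fraction of desired conditions met.

def score_candidate(similarity, attributes, conditions,
                    w_sim=0.7, w_cond=0.3):
    # similarity: 0.0-1.0 output of the judgment model
    # attributes: e.g. {"hobby": "tennis", "age": 29}
    # conditions: desired values set by the user, same keys as attributes
    if conditions:
        met = sum(1 for key, wanted in conditions.items()
                  if attributes.get(key) == wanted)
        cond_score = met / len(conditions)
    else:
        cond_score = 0.0
    return w_sim * similarity + w_cond * cond_score

def most_similar(candidates, conditions):
    # candidates: list of (similarity, attributes) for people within ~10 m
    return max(candidates,
               key=lambda c: score_candidate(c[0], c[1], conditions),
               default=None)
```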
If it is determined in step S103 that a similar person has been detected, the process proceeds to step S104. In step S104, the situation analysis unit 154 analyzes the direction in which the person similar to the one the user likes is facing, based on the determination result data of the judgment model, the captured image data, and the like.
In step S105, the situation analysis unit 154 determines whether the similar person is looking in the user's direction, based on the analysis result of the direction the similar person is facing.
If it is determined in step S105 that the similar person is looking in the user's direction, the process proceeds to step S106. In step S106, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that calms the user.
Since the impression of a scent differs from user to user, the user may, for example, be asked in advance to rate the impressions of scents from a prepared fragrance set, and a library of the various scents and each user's impressions of them may be built and kept.
This allows the user to set what kind of emotion he or she wants as the result of smelling a scent. This library may be held by the terminal device 10, or may be managed by the server 20 and provided to the terminal device 10 as needed.
For example, in addition to the above-mentioned "calming", the impressions to be rated can use various adjective expressions such as "exciting", "blessing", "encouraging", "awakening", "passionate", "refusing", and "refreshing", and the various scents of the fragrance set are assigned to each of these adjective expressions and kept in the library. The impressions of scents can be rated with evaluation methods such as a three-grade or five-grade scale or a visual analog scale (VAS).
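Such a library could be stored, for example, as a simple per-user mapping from adjective expression to rated scents; the structure and the example ratings below are assumptions made for illustration.

```python
# Sketch of one user's scent-impression library: VAS ratings (0-100)
# per fragrance, grouped by adjective expression. All values made up.

impression_library = {
    "calming":   {"lavender": 82, "sandalwood": 74},
    "exciting":  {"citrus": 77, "ginger": 65},
    "awakening": {"mint": 90},
}

def best_fragrance(impression: str):
    # Return the fragrance this user rated highest for the impression.
    ratings = impression_library.get(impression, {})
    return max(ratings, key=ratings.get) if ratings else None

print(best_fragrance("calming"))  # -> "lavender"
```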
In this way, when someone who resembles a person the user likes is looking at the user, a calming scent is discharged, so the user can calm down with that scent and take the next action.
On the other hand, if it is determined in step S105 that the similar person is not looking in the user's direction, the process proceeds to step S107. In step S107, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that excites the user.
In this way, when someone who resembles a person the user likes is not looking at the user, an exciting scent is discharged, so the user can raise his or her spirits with that scent and take the next action with momentum.
That is, the control unit 100 controls the discharge of the fragrance based on behavior information, obtained from the sensor data, about the behavior of the target (the person similar to someone the user likes) toward the user. The control unit 100 also changes the type of fragrance depending on whether the target is looking in the user's direction.
When the process of step S106 or S107 ends, the process proceeds to step S108 of Fig. 7.
In step S108, the situation analysis unit 154 determines whether the distance to the similar person has come within 3 m, based on the determination result data of the judgment model, the captured image data, and the like.
In the determination of step S108, the process waits for the distance to the similar person to come within 3 m and then proceeds to step S109. In step S109, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to strengthen the discharged scent.
Fig. 8 shows an example of the positional relationship between a user wearing the terminal device 10 as a spectacle-type device (hereinafter also referred to as user U1) and a person similar to someone that user likes (hereinafter also referred to as target T1).
In Fig. 8, the target T1 has entered a circular region A, a region within a radius of 3 m centered on the position of the user U1.
At this time, if the target T1 is looking at the user U1, the discharge amount of the calming scent that has been discharged is increased, making the scent stronger. On the other hand, if the target T1 is not looking at the user U1, the discharge amount of the exciting scent that has been discharged is increased, making that scent stronger.
In this way, the closer the target T1 comes to the user U1, the more the discharge amount of the scent being discharged is increased, so that the user U1 can be supported with a more emphasized scent.
That is, the control unit 100 controls the discharge of the fragrance based on distance information, obtained from the sensor data, about the distance between the user and the target. Further, when the value indicated by the distance information falls below a predetermined threshold, the control unit 100 increases the discharge amount of the fragrance.
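One possible mapping from the target's distance to a discharge amount consistent with this behavior is sketched below; the linear ramp and all numeric values are assumptions, since the text only specifies that the amount increases below the threshold.

```python
# Hypothetical distance-to-amount mapping: the base amount is used
# beyond the 3 m threshold, and the amount ramps up as the target
# comes closer.

def discharge_amount(distance_m, threshold_m=3.0,
                     base=1.0, max_boost=2.0):
    if distance_m >= threshold_m:
        return base
    # Full boost at 0 m, no boost exactly at the threshold.
    return base + max_boost * (1.0 - distance_m / threshold_m)

print(discharge_amount(5.0))  # 1.0 (outside circular region A)
print(discharge_amount(1.5))  # 2.0 (halfway inside region A)
```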
Returning to the description of Fig. 7, when the process of step S109 ends, the process proceeds to step S110.
In step S110, the emotion data acquisition unit 153 acquires emotion data of the person similar to someone the user likes.
The emotion data is data about the emotions of the target T1 (Fig. 8) toward the user U1 (Fig. 8). For example, the emotion data includes sensor data detected by the wearable terminal worn by the target T1 (for example, data on pulse, brain waves, sweating, body temperature, iris, voice pitch, breathing, and the like) and sensor data detected by the terminal device 10 worn by the user U1 (for example, captured image data of the target T1).
In step S111, based on the acquired emotion data, the situation analysis unit 154 analyzes the emotions of the similar person, for example by estimating that person's affective state.
The analysis of the emotions of the person similar to someone the user likes may instead be executed by the server 20 on the cloud.
That is, the wearable terminal worn by the target T1 (Fig. 8) transmits emotion data including the sensor data it has detected to the server 20 via the network 50. Likewise, the terminal device 10 worn by the user U1 (Fig. 8) transmits emotion data including the sensor data it has detected to the server 20 via the network 50.
The server 20 analyzes the emotions of the target T1, for example by estimating the target T1's affective state, based on the emotion data transmitted from the wearable terminal and the terminal device 10. The server 20 transmits the analysis result data to the terminal device 10 via the network 50. The terminal device 10 can thereby acquire the emotion analysis result data of the target T1 transmitted from the server 20.
In step S112, the situation analysis unit 154 determines whether the similar person's reaction is a good one, based on the emotion analysis result data.
If it is determined in step S112 that the similar person's reaction is a good one (a positive emotion), the process proceeds to step S113. In step S113, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent of blessing.
In this way, when someone who resembles a person the user likes approaches and that person's reaction is a good one, a scent that congratulates the user is discharged, so the scent can make the user feel happy.
On the other hand, if it is determined in step S112 that the similar person's reaction is not a good one (a negative emotion), the process proceeds to step S114. In step S114, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent of encouragement.
In this way, when someone who resembles a person the user likes has approached but that person's reaction is not a good one, a scent of encouragement is discharged, so the scent can keep the user's drop in spirits to a minimum.
That is, the control unit 100 controls the discharge of the fragrance based on emotion data about the emotions of the targeted surrounding person toward the user. The control unit 100 also changes the type of fragrance according to the compatibility between the targeted surrounding person and the user.
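A minimal sketch of this reaction branch follows; representing the analyzed emotion as a single valence value and using zero as the threshold are assumptions made for illustration.

```python
# Sketch of steps S112-S114: choose the scent from the sign of the
# target's estimated emotional valence (-1.0 negative .. +1.0 positive).

def reaction_scent(emotion_analysis: dict) -> str:
    valence = emotion_analysis.get("valence", 0.0)
    return "blessing" if valence > 0.0 else "encouragement"

print(reaction_scent({"valence": 0.6}))   # -> "blessing"
print(reaction_scent({"valence": -0.4}))  # -> "encouragement"
```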
When the process of step S113 or S114 ends, the fragrance support processing ends.
The flow of the first example of the fragrance support processing has been described above. In this first example, the control unit 100 of the terminal device 10, such as a spectacle-type device, uses a judgment model generated by machine learning with personal data for each user and controls the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
As a result, when the user wearing the terminal device 10, such as a spectacle-type device, searches the surrounding people for the person who best matches his or her preferences, the scent supports the user's feelings, making it easier to approach that person.
(Flow of the learning processing)
Fig. 9 is a flowchart explaining a first example of the learning processing executed by the control unit 200 of the server 20.
This learning processing is executed prior to the fragrance support processing described above (Figs. 6 and 7), and the trained judgment model generated here is provided to the control unit 100 of the terminal device 10.
In step S201, the learning data acquisition unit 251 acquires personal data such as image data of people the user likes.
For example, the image data of people the user likes can be selected from stock photos, libraries synthesized using artificial intelligence (AI), and the like. The user may also operate a device to add images of people he or she likes (for example, celebrities) or images of people he or she dislikes to the library.
The personal data can also include information about the individual user in general, such as biometric data, social data, life logs, and schedules.
In step S202, the learning DB 252 accumulates the acquired personal data, such as the image data, as learning data.
In step S203, the learning unit 253 performs learning processing using the learning data accumulated in the learning DB 252 and generates a judgment model.
As the machine learning method used in this learning processing, for example, a neural network (NN), deep learning, a support vector machine (SVM), self-organizing maps (SOM), or k-means clustering can be used.
For example, when image data of people the user likes is used as the learning data, the learning unit 253 performs pattern extraction as the learning processing by image recognition using an extraction and determination algorithm such as the deep learning mentioned above.
In step S204, the providing unit 254 provides the generated trained judgment model to the control unit 100 of the terminal device 10.
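As a concrete instance of one of the methods listed above, the following hedged sketch trains an SVM on feature vectors extracted from face images; the feature extraction is left abstract, and nothing here should be read as the patent's actual algorithm.

```python
# Illustrative training of a "liked / not liked" judgment model with
# an SVM (one of the methods named above), using scikit-learn. Face
# embeddings are assumed to come from a separate feature extractor.

import numpy as np
from sklearn.svm import SVC

def train_judgment_model(embeddings: np.ndarray,
                         labels: np.ndarray) -> SVC:
    # embeddings: shape (n_samples, n_dims), one vector per face image
    # labels: 1 = image of a person the user likes, 0 = otherwise
    model = SVC(kernel="rbf", probability=True)
    model.fit(embeddings, labels)
    return model

# At the terminal, model.predict_proba(embedding)[0, 1] could then
# serve as the similarity score for a person captured by the camera
# unit 112.
```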
When the process of step S204 ends, the learning processing ends.
The flow of the first example of the learning processing has been described above. In this first example, the judgment model that the control unit 100 needs in order to execute the fragrance support processing (Figs. 6 and 7) is generated and provided to the terminal device 10.
(Flow of the scent customization processing)
Fig. 10 is a flowchart explaining an example of the scent customization processing executed by the control unit 200 of the server 20.
In step S231, the learning data acquisition unit 251 acquires the emotion data transmitted from the wearable terminal and the terminal device 10. This emotion data is accumulated in the learning DB 252 as learning data.
In step S232, the learning unit 253 performs learning processing using the emotion data accumulated in the learning DB 252 and customizes the scent to be discharged the next time the user meets the person he or she likes.
For example, in this learning processing, the learning data accumulated in the learning DB 252 includes the emotion data together with the data of each user's library of scents and impressions. Then, for example, when emotion data about the emotions (positive or negative) of the target T1 (Fig. 8) toward the user U1 (Fig. 8) is input to the trained judgment model, information about a customized scent is output.
In step S233, the providing unit 254 provides the information about the customized scent to the control unit 100 of the terminal device 10. With this, the terminal device 10 can, for example, change the scents discharged according to the direction the similar person is facing (the scents discharged in steps S106 and S107 of Fig. 6) to more appropriate ones, based on the customized scent information.
That is, the control unit 100 of the terminal device 10 feeds back the emotion data and controls the discharge of a fragrance customized according to the fed-back emotion data.
When the process of step S233 ends, the scent customization processing ends.
The flow of the example of the scent customization processing has been described above. In this scent customization processing, the scent discharged by the terminal device 10 is changed according to the other person's previous reactions (positive or negative emotions), so the user's feelings can be supported with a better scent. A simple form of such feedback is sketched below.
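For instance, the feedback could be as simple as a running preference score per scent that is nudged by each positive or negative reaction; the update rule and learning rate below are assumptions, not the patent's learning processing.

```python
# Minimal illustrative feedback: keep a 0-1 preference score per scent
# and move it toward 1 after a positive reaction, toward 0 after a
# negative one.

def update_preference(prefs: dict, scent: str,
                      positive: bool, lr: float = 0.1) -> dict:
    target = 1.0 if positive else 0.0
    old = prefs.get(scent, 0.5)          # start neutral
    prefs[scent] = old + lr * (target - old)
    return prefs

prefs = {}
update_preference(prefs, "citrus", positive=True)
print(prefs)  # {'citrus': 0.55}
```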
<2. Second Embodiment>
(Flow of the fragrance support processing)
Next, a second example of the fragrance support processing executed by the control unit 100 of the terminal device 10 will be described with reference to the flowchart of Fig. 11.
Before this fragrance support processing is executed, the terminal device 10 worn by the user is provided by the server 20 with a judgment model trained with personal data such as image data of people the user likes, and the model is made available to the control unit 100.
In steps S131 to S135, as in steps S101 to S105 of Fig. 6, the judgment model is applied to the captured image data, and when a person similar to someone the user likes is detected among the people around the user, it is determined whether that similar person is looking in the user's direction.
If it is determined in step S135 that the similar person is looking in the user's direction, the process proceeds to step S136. In step S136, as in step S106, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that calms the user.
On the other hand, if it is determined in step S135 that the similar person is not looking in the user's direction, the process proceeds to step S137.
In step S137, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that excites the user. At the same time as the exciting scent is discharged, the user's appearance is changed in terms of clothes, hairstyle, makeup, eye size and color, and the like, according to the preferences of the target person (the person similar to someone the user likes).
Specifically, assuming that the target person is also wearing a terminal device 10 as a spectacle-type device like the user, information such as clothes and hairstyle may be drawn on the target person's spectacle-type device using AR technology or the like, so that the user's appearance seen through the transmissive displays looks changed.
Alternatively, the user's appearance in the real space may be changed, for example by changing the user's eye color with contact lenses, or by emphasizing the shape of a part of the user's body with underwear capable of changing that shape.
Information about the target person's preferences can be estimated using the emotion data transmitted from the wearable terminal worn by that person, or using information about that person published on the Internet (for example, social media information). Alternatively, the server 20 may manage information about the target person's preferences in a database, or include such information in the learning data when performing the learning processing.
That is, the control unit 100 of the terminal device 10 worn by the target person changes the user's appearance as seen from the target when the target is not looking in the user's direction, so as to attract the target's attention.
When the process of step S136 or S137 ends, the subsequent processing is executed; since it is the same as the processing of steps S108 to S114 of Fig. 7, its description is omitted.
The flow of the second example of the fragrance support processing has been described above. In this second example, when the user wearing the terminal device 10, such as a spectacle-type device, searches the surrounding people for the person who best matches his or her preferences, the scent supports the user's feelings, and the user can also change into a person attractive to that partner.
<3. Third Embodiment>
(Flow of the fragrance support processing)
Next, a third example of the fragrance support processing executed by the control unit 100 of the terminal device 10 will be described with reference to the flowchart of Fig. 12.
When this fragrance support process is executed, the terminal device 10 worn by the user is provided by the server 20 with a determination model trained using personal data such as image data of people the user is not comfortable with, and the model is made available to the control unit 100.
In step S161, the sensor data acquisition unit 151 acquires captured image data captured in real time by the camera unit 112.
In step S162, the determination model application unit 152 applies the trained determination model to the acquired captured image data.
In step S163, the situation analysis unit 154 determines, based on the determination result data of the determination model, the captured image data, and the like, whether a person the user is not comfortable with has been detected among the surrounding people within a radius of 10 m centered on the user.
If it is determined in the determination process of step S163 that no such person has been detected, the process returns to step S161, and the processes of steps S161 to S163 are repeated.
If it is determined in the determination process of step S163 that such a person has been detected, the process proceeds to step S164. In step S164, the situation analysis unit 154 analyzes the direction in which that person is facing, based on the determination result data of the determination model and the captured image data.
In step S165, the situation analysis unit 154 determines, based on the analysis result of the direction the person is facing, whether the person the user is not comfortable with is looking in the direction of the user.
If it is determined in the determination process of step S165 that the person is looking in the direction of the user, the process proceeds to step S166. In step S166, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge a scent that calms the user.
In this way, when a person the user is not comfortable with is looking at the user, a calming scent is discharged, so the user can calm down with that scent and take the next action.
On the other hand, if it is determined in the determination process of step S165 that the person is not looking in the direction of the user, the process proceeds to step S167. In step S167, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to discharge an awakening scent that encourages quick action.
In this way, when the person the user is not comfortable with is not looking at the user, an awakening scent that encourages quick action is discharged, so the user can move quickly in response and immediately move away from that person. For example, a scent such as mint can be discharged as this awakening scent, based on information held in the library.
When the process of step S166 or S167 is completed, the process proceeds to step S168.
In step S168, the situation analysis unit 154 determines, based on the determination result data of the determination model, the captured image data, and the like, whether the distance to that person has come within 3 m.
In the determination process of step S168, the process waits until the distance to the person comes within 3 m, and then proceeds to step S169. In step S169, the fragrance discharge control unit 155 controls the fragrance discharge unit 123 to strengthen the discharged scent.
In this way, the closer the person the user is not comfortable with comes to the user, the more the discharge amount of the scent is increased, so the user can be supported with a more emphasized scent.
When the process of step S169 is completed, the process ends.
This concludes the flow of the third example of the fragrance support process. In this third example, when searching the people around a user wearing the terminal device 10 such as a glasses-type device for people the user is not comfortable with, the scent supports the user's feelings and helps the user avoid encountering those people.
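For clarity, the control flow of FIG. 12 can be sketched as the following loop. This is a minimal sketch, not the disclosed implementation: the Person type, detect_disliked_person, and the camera, model, and dispenser objects are hypothetical stand-ins for the units described above, while the 10 m and 3 m thresholds follow steps S163 and S168.

```python
# A minimal sketch of the control loop of FIG. 12 (steps S161-S169).
# detect_disliked_person is a hypothetical helper standing in for the
# situation analysis unit 154; dispenser stands in for the fragrance
# discharge unit 123.
from dataclasses import dataclass
from typing import Optional

DETECTION_RADIUS_M = 10.0    # S163: search within a 10 m radius of the user
PROXIMITY_THRESHOLD_M = 3.0  # S168: strengthen the scent within 3 m

@dataclass
class Person:
    distance_m: float
    facing_user: bool

def fragrance_support_once(camera, model, dispenser) -> None:
    # S161-S163: repeat until a disliked person is detected nearby
    person: Optional[Person] = None
    while person is None:
        frame = camera.capture()           # S161: real-time captured image
        result = model.predict(frame)      # S162: apply the trained model
        person = detect_disliked_person(result, frame, DETECTION_RADIUS_M)

    # S164-S167: choose the scent by where the detected person is looking
    if person.facing_user:
        dispenser.discharge("calming")     # S166: calm the user down
    else:
        dispenser.discharge("awakening")   # S167: e.g. a mint scent

    # S168-S169: once the person comes within 3 m, strengthen the scent
    while person is not None and person.distance_m > PROXIMITY_THRESHOLD_M:
        frame = camera.capture()
        person = detect_disliked_person(model.predict(frame), frame,
                                        DETECTION_RADIUS_M)
    if person is not None:
        dispenser.increase_intensity()     # S169: stronger discharge
```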
(Flow of learning processing)
FIG. 13 is a flowchart illustrating a second example of the learning process executed by the control unit 200 of the server 20.
This learning process is executed prior to the fragrance support process described above (FIG. 12), and the trained determination model generated here is provided to the control unit 100 of the terminal device 10.
In step S261, the learning data acquisition unit 251 acquires personal data such as image data of people the user is not comfortable with.
In step S262, the learning unit 253 performs learning using the acquired personal data such as image data, and generates a determination model.
In step S263, the providing unit 254 provides the generated trained determination model to the control unit 100 of the terminal device 10.
When the process of step S263 is completed, the process ends.
This concludes the flow of the second example of the learning process. In this second example, the determination model required for the control unit 100 to execute the fragrance support process (FIG. 12) is generated and provided to the terminal device 10.
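As a rough sketch, the three steps of FIG. 13 can be expressed as follows. The choice of logistic regression over image embeddings is purely an assumption for illustration; the disclosure does not specify a learning algorithm, and terminal.install_model is a hypothetical delivery call.

```python
# A hedged sketch of the learning process of FIG. 13 (steps S261-S263),
# assuming face-image embeddings and labels are already prepared as arrays.
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_determination_model(features: np.ndarray, labels: np.ndarray):
    """S262: learn a per-user determination model from personal data.

    features: embeddings of face images from the user's personal data
    labels:   1 if the pictured person is one the user dislikes, else 0
    """
    model = LogisticRegression(max_iter=1000)
    model.fit(features, labels)
    return model

def provide_to_terminal(model, terminal) -> None:
    """S263: deliver the trained model to the terminal device 10.

    terminal.install_model is a hypothetical call; how the model is
    serialized and transported is not fixed by the disclosure.
    """
    terminal.install_model(model)
```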
<4. Fourth Embodiment>
(Server configuration)
FIG. 14 shows another example of the functional configuration of the control unit 200 included in the server 20.
In FIG. 14, the server 20 has a target data acquisition unit 261, a matching DB 262, a matching processing unit 263, and a matching result providing unit 264.
The target data acquisition unit 261 controls the communication unit 208 to acquire target data transmitted from the terminal device 10 and supplies it to the matching processing unit 263.
This target data includes data about a person similar to a person whom a given user likes.
The matching DB 262 stores the data required for the matching process. This data includes, for example, data for analyzing the compatibility of each pair of users.
The matching processing unit 263 refers to the data stored in the matching DB 262 and performs matching processing using the target data supplied from the target data acquisition unit 261. The matching processing unit 263 supplies the matching result data to the matching result providing unit 264.
The matching result providing unit 264 controls the communication unit 208 to provide the matching result data supplied from the matching processing unit 263 to the terminal device 10.
(Flow of matching service processing)
Next, an example of the matching service process executed between the plurality of terminal devices 10 and the server 20 will be described with reference to the flowcharts of FIGS. 15 and 16.
In this matching service process, processing for matching preferred people among users wearing the terminal devices 10 is performed.
FIGS. 15 and 16 show the processes executed by the terminal device 10-1 worn by a user and by the terminal devices 10-2 and 10-3 worn by persons A and B whom the user prefers.
When the matching service process is executed, each of the terminal devices 10-1 to 10-3 is provided with a determination model trained using personal data such as image data of people liked by the user of that terminal device 10, and the model is made available to the control unit 100 of each terminal device 10.
In the terminal device 10-1 worn by the user, the processes of steps S301 to S304 of FIG. 15 are executed.
That is, the terminal device 10-1 applies the trained determination model to captured image data captured in real time (S301, S302) and determines whether a person similar to a person the user likes has been detected among the people around the user (S303). When the terminal device 10-1 detects such a similar person, data about that person is transmitted to the server 20 as target data (S304).
In the terminal device 10-2 worn by the user's preferred person A, the processes of steps S311 to S314 of FIG. 15 are executed.
That is, the terminal device 10-2 applies the trained determination model to captured image data captured in real time (S311, S312) and determines whether a person similar to a person whom person A likes has been detected among the people around person A (S313). When the terminal device 10-2 detects such a similar person, data about that person is transmitted to the server 20 as target data (S314).
In the terminal device 10-3 worn by the user's preferred person B, the processes of steps S321 to S324 of FIG. 15 are executed.
That is, the terminal device 10-3 applies the trained determination model to captured image data captured in real time (S321, S322) and determines whether a person similar to a person whom person B likes has been detected among the people around person B (S323). When the terminal device 10-3 detects such a similar person, data about that person is transmitted to the server 20 as target data (S324).
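The three terminal-side flows above are symmetric, so each device can run the same detect-and-report routine. The following is a minimal sketch under assumed names; detect_similar_person and server_client are hypothetical stand-ins, and the disclosure does not fix a transport format for the target data.

```python
# A sketch of the per-terminal loop (S301-S304 / S311-S314 / S321-S324).
# detect_similar_person and server_client are hypothetical stand-ins.

def report_similar_person(camera, model, server_client, user_id) -> None:
    frame = camera.capture()                       # S301: real-time image
    result = model.predict(frame)                  # S302: apply trained model
    target = detect_similar_person(result, frame)  # S303: similar person?
    if target is not None:
        server_client.send_target_data(user_id, target)  # S304: target data
```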
In the server 20, which receives the target data transmitted from the terminal devices 10-1 to 10-3, the processes of steps S331 to S333 of FIG. 16 are executed.
In step S331, the target data acquisition unit 261 acquires the target data transmitted from the terminal devices 10-1 to 10-3.
In step S332, the matching processing unit 263 refers to the data stored in the matching DB 262 and performs matching processing using the acquired target data.
In this matching process, for example, a ranking of the candidates is calculated based on information such as the distance between the users at that time and the compatibility of each pair of users, and the highest-ranked person is determined as the matching partner. The compatibility here can include, for example, the degree of mutual preference for each other's images, the degree of agreement of tastes, and the degree of agreement (or disagreement) of DNA information.
That is, at the time of matching, the control unit 200 can determine the user with the highest degree of matching as the matching partner, based on the compatibility together with the distance between the matching users at that time.
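As one concrete reading of this ranking, the following sketch combines a compatibility score with the current inter-user distance and picks the best candidate. The score fields and the 0.7/0.3 weighting are assumptions for illustration; the disclosure fixes only that compatibility and the distance at that time both inform the decision.

```python
# A sketch of the matching decision in step S332. The compatibility score
# and the 0.7 / 0.3 weighting are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Candidate:
    user_id: str
    compatibility: float  # mutual image preference, tastes, DNA agreement...
    distance_m: float     # distance between the two users at this moment

def match_score(c: Candidate, max_distance_m: float = 100.0) -> float:
    # Closer candidates score higher; beyond max_distance_m proximity is 0.
    proximity = max(0.0, 1.0 - c.distance_m / max_distance_m)
    return 0.7 * c.compatibility + 0.3 * proximity

def decide_partner(candidates: list[Candidate]) -> Optional[Candidate]:
    # The highest-ranked candidate becomes the matching partner.
    return max(candidates, key=match_score, default=None)
```

In this sketch, shrinking max_distance_m makes proximity dominate the decision more quickly; the real weighting is a design choice the disclosure leaves open.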
In step S333, the matching result providing unit 264 transmits the matching result data obtained by the matching process to the terminal devices 10 according to the matching result. For example, when the user and the preferred person A are matched by the matching process, the matching result data is transmitted to the terminal device 10-1 and the terminal device 10-2 used by the matched persons.
In each of the terminal device 10-1 and the terminal device 10-2, which receive the matching result data transmitted from the server 20, the processes of steps S305 to S308 of FIG. 16 are executed.
That is, in the terminal device 10-1 and the terminal device 10-2, each fragrance discharge control unit 155 controls its fragrance discharge unit 123 to discharge a blessing scent (S305). This blessing scent is controlled by each fragrance discharge control unit 155 so that the scent becomes stronger as the distance between the user and the preferred person A becomes shorter (S305).
Also, in the terminal device 10-1 and the terminal device 10-2, each situation analysis unit 154 determines, based on data such as emotion data, whether either the user or the preferred person A feels uncomfortable (S306).
Note that this determination of whether there is discomfort may instead be executed by the server 20, with the server 20 acquiring data such as the emotion data. In this case, the terminal device 10-1 and the terminal device 10-2 receive the data of the discomfort determination result transmitted from the server 20.
If it is determined in the determination process of step S306 that neither of the two persons feels uncomfortable, the process proceeds to step S307. Then, in the terminal device 10-1 and the terminal device 10-2, when the distance between the two persons comes within a radius of 1 m, each fragrance discharge control unit 155 controls its fragrance discharge unit 123 to discharge a passionate scent (S307).
Note that, here as well, when detecting the other person within a predetermined range such as a radius of 1 m, besides detecting a person within a certain distance using the captured image data, position data acquired by each of the terminal device 10-1 and the terminal device 10-2 (for example, GPS position data) may be collated, or radio waves may be captured by short-range wireless communication such as Bluetooth (registered trademark).
On the other hand, if it is determined in the determination process of step S306 that one of the two persons feels uncomfortable, the process proceeds to step S308. Then, in the terminal device 10-1 and the terminal device 10-2, each fragrance discharge control unit 155 controls its fragrance discharge unit 123 to discharge a refusal scent (S308).
In this way, when the two matched persons are satisfied with the matching result, the blessing scent and the passionate scent can positively support their feelings. On the other hand, even if at least one of the two matched persons feels uncomfortable with the matching result, the refusal scent can support at least the feelings of the person who feels uncomfortable.
That is, the control unit 200 acquires target data about the target transmitted from the terminal device 10 of each user, and performs matching according to the compatibility of each user based on the target data for each user.
Also, the control unit 100 controls the discharge of the fragrance based on distance information about the distance between the matched users, obtained from the sensor data of the terminal devices 10 of the matched users. The control unit 100 increases the discharge amount of a predetermined type of fragrance as the distance between the matched users becomes shorter. Furthermore, when the value indicated by the distance information falls below a predetermined threshold, the control unit 100 discharges a predetermined type of fragrance.
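The distance-driven control just summarized can be sketched as a simple mapping from distance to discharge intensity. The linear ramp, the 10 m range, and the dispenser interface are assumptions; the 1 m threshold for the passionate scent follows step S307 above.

```python
# A sketch of distance-driven discharge control for matched users
# (S305 and S307). dispenser stands in for the fragrance discharge
# unit 123; the linear ramp and 10 m range are assumptions.

PASSION_THRESHOLD_M = 1.0  # S307: within a 1 m radius, passionate scent
MAX_RANGE_M = 10.0         # assumed range over which the scent ramps up

def discharge_for_distance(dispenser, distance_m: float) -> None:
    if distance_m < PASSION_THRESHOLD_M:
        dispenser.discharge("passionate", intensity=1.0)
    else:
        # S305: blessing scent, stronger as the two users come closer
        closeness = max(0.0, 1.0 - distance_m / MAX_RANGE_M)
        dispenser.discharge("blessing", intensity=closeness)
```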
Also, the control unit 100 discharges a predetermined type of fragrance when at least one of the matched users feels uncomfortable, based on emotion data about the matched users' feelings toward each other.
When the process of step S307 or S308 is completed, the process ends.
This concludes the flow of the matching service process. In this matching service process, when matching between users is performed using the target data transmitted from the terminal devices 10 worn by the users, the scent can support the feelings of the two matched users. With the support of this scent, it is expected that the probability of successful matching will improve, which in turn may help address the problem of the declining birthrate.
<5. Modifications>
(Other configurations of the fragrance support system)
FIG. 17 shows another example of the configuration of an embodiment of the fragrance support system to which the present technology is applied.
In FIG. 17, the fragrance support system 1 is provided with an information device 30 and a fragrance discharge device 40 in addition to the terminal device 10 and the server 20. The terminal device 10, the information device 30, and the fragrance discharge device 40 are carried by one user.
The information device 30 is configured as, for example, a smartphone, a tablet terminal, a portable music player, or a game machine. The information device 30 has a communication function, can connect to the network 50, and can communicate with the terminal device 10 by short-range wireless communication such as Bluetooth (registered trademark).
The fragrance discharge device 40 has a stick-like shape such as a round bar or a square bar, and has the same function as the fragrance discharge unit 123 (FIG. 2). In this configuration, the fragrance discharge unit 123 (FIG. 2) is removed from the terminal device 10 shown in FIG. 2.
That is, the fragrance discharge device 40 has a fragrance holding structure filled with a liquid fragrance, and releases the fragrance by flowing air through a holding space provided in the fragrance holding structure to vaporize the liquid fragrance held by a fragrance holding body arranged in that holding space.
The fragrance discharge device 40 also has a communication function and can communicate with the terminal device 10 by short-range wireless communication such as Bluetooth (registered trademark).
In the example configuration of FIG. 17, the terminal device 10 exchanges data with the server 20 on the network 50 via the information device 30 to execute the fragrance support processes described above (FIGS. 6, 7, etc.). Also, when discharging the various scents during the fragrance support processes (S106 and S107 in FIG. 6, S113 and S114 in FIG. 7, etc.), the terminal device 10 causes the scents to be discharged by having the fragrance discharge control unit 155 (FIG. 3) control the fragrance discharge device 40.
As described above, in the fragrance support system 1, the device that executes the fragrance support process and the device that discharges the fragrance do not necessarily have to be the same device. Although the example configuration of FIG. 17 includes an information device 30 such as a smartphone, it is of course also possible for the terminal device 10 to exchange data directly with the server 20 on the network 50 without providing the information device 30.
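With the discharge device split out, the control path from the fragrance discharge control unit 155 reduces to sending a small command over the short-range link. The following sketch assumes a JSON command format and a generic link object; the disclosure states only that short-range wireless communication such as Bluetooth (registered trademark) is used, and does not fix any command protocol.

```python
# A sketch of the terminal device 10 driving the separate fragrance
# discharge device 40. The JSON command format and the `link` object
# (e.g. a Bluetooth serial channel) are assumptions for illustration.
import json

def send_discharge_command(link, scent: str, intensity: float) -> None:
    command = json.dumps({"scent": scent, "intensity": intensity})
    link.write(command.encode("utf-8"))
```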
In the above description, the control unit 100 of FIG. 3 is included in the terminal device 10 and the control unit 200 of FIG. 5 is included in the server 20, but all or part of the functions of the control unit 100 of FIG. 3 may be included in the server 20, and all or part of the functions of the control unit 200 of FIG. 5 may be included in the terminal device 10.
For example, the determination model application unit 152 and the emotion data acquisition unit 153 of FIG. 3 may be provided in (the control unit 200 of) the server 20. In this case, the terminal device 10 transmits the data to which the determination model is to be applied to the server 20 via the network 50, and receives the determination result data of the trained determination model from the server 20.
Also, the scent customization process shown in FIG. 10 may be executed by the control unit 100 of the terminal device 10 worn by each user. That is, each process in the above flowcharts is executed by the control unit 100 of the terminal device 10, by the control unit 200 of the server 20, or by the control unit 100 and the control unit 200 working together.
<6. Computer Configuration>
The series of processes of the terminal device 10 or the server 20 described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed on the computer of each device.
In the computer of each device, the CPU loads a program recorded in the ROM or the storage unit into the RAM and executes it, whereby the series of processes described above is performed.
A program executed by the computer (CPU) can be provided by being recorded on a removable recording medium such as package media. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
In the computer, the program can be installed in the storage unit by attaching the removable recording medium to the drive. The program can also be received by the communication unit via a wired or wireless transmission medium and installed in the storage unit. Alternatively, the program can be installed in advance in the ROM or the storage unit.
Here, in this specification, the processing performed by the computer according to the program does not necessarily have to be performed in time series in the order described in the flowcharts. That is, the processing performed by the computer according to the program also includes processing executed in parallel or individually (for example, parallel processing or object-based processing).
The program may be processed by one computer (processor) or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed there.
Furthermore, in this specification, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device in which a plurality of modules are housed in one housing, are both systems.
The embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology. For example, the present technology can take a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
Each step described in the above flowcharts can be executed by one device or shared among a plurality of devices. Furthermore, when one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared among a plurality of devices.
The effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
The present technology can also have the following configurations.
(1)
An information processing device including a control unit that uses a determination model generated by machine learning using personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
(2)
The information processing device according to (1), in which the personal data includes data on the user's compatibility with other people, the sensor data includes captured image data of the user's surroundings, and the control unit uses the determination model to control the discharge of the fragrance for the user who acts toward, as a target, a surrounding person estimated from the input captured image data according to the compatibility.
(3)
The information processing device according to (2), in which the control unit controls the discharge of the fragrance based on behavior information about the target's behavior toward the user, obtained from the sensor data.
(4)
The information processing device according to (3), in which the control unit changes the type of the fragrance depending on whether the target is looking in the direction of the user.
(5)
The information processing device according to (4), in which, when the target is not looking in the direction of the user, the control unit changes the user's appearance as seen from the target so as to attract the target's attention.
(6)
The information processing device according to any one of (2) to (5), in which the control unit controls the discharge of the fragrance based on distance information about the distance between the user and the target, obtained from the sensor data.
(7)
The information processing device according to (6), in which the control unit increases the discharge amount of the fragrance when the value indicated by the distance information is less than a predetermined threshold.
(8)
The information processing device according to any one of (2) to (7), in which the control unit controls the discharge of the fragrance based on emotion data about the feelings of the surrounding person serving as the target toward the user.
(9)
The information processing device according to (8), in which the control unit changes the type of the fragrance according to the compatibility of the surrounding person serving as the target with the user.
(10)
The information processing device according to (8) or (9), in which the control unit feeds back the emotion data and controls the discharge of the fragrance customized according to the fed-back emotion data.
(11)
The information processing device according to any one of (1) to (10), in which the personal data includes the user's desired conditions for estimating the target.
(12)
The information processing device according to (2), in which the control unit acquires target data about the target transmitted from each user's terminal, and performs matching according to the compatibility of each user based on the target data for each user.
(13)
The information processing device according to (12), in which the control unit controls the discharge of the fragrance based on distance information about the distance between the matched users, obtained from the sensor data of the matched users' terminals.
(14)
The information processing device according to (13), in which the control unit increases the discharge amount of a predetermined type of fragrance as the distance between the matched users becomes shorter.
(15)
The information processing device according to (13) or (14), in which the control unit discharges a predetermined type of fragrance when the value indicated by the distance information is less than a predetermined threshold.
(16)
The information processing device according to any one of (12) to (15), in which the control unit discharges a predetermined type of fragrance when at least one of the matched users feels uncomfortable, based on emotion data about the matched users' feelings toward each other.
(17)
The information processing device according to any one of (12) to (16), in which, at the time of matching, the control unit determines the user with the highest degree of matching as the matching partner based on the compatibility together with the distance between the matching users at that time.
(18)
The information processing device according to any one of (1) to (11), configured as a wearable terminal having a device that detects the sensor data.
(19)
The information processing device according to any one of (12) to (17), configured as a server connected via a network to a wearable terminal having a device that detects the sensor data.
(20)
An information processing method in which an information processing device uses a determination model generated by machine learning using personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
1 fragrance support system, 10, 10-1 to 10-N terminal device, 20 server, 50 network, 100 control unit, 101 CPU, 102 ROM, 103 RAM, 105 input unit, 106 output unit, 107 storage unit, 108 communication unit, 109 short-range wireless communication unit, 111 operation unit, 112 camera unit, 113 sensor unit, 121 display unit, 122 sound output unit, 123 fragrance discharge unit, 151 sensor data acquisition unit, 152 determination model application unit, 153 emotion data acquisition unit, 154 situation analysis unit, 155 fragrance discharge control unit, 200 control unit, 201 CPU, 202 ROM, 203 RAM, 205 input unit, 206 output unit, 207 storage unit, 208 communication unit, 209 drive, 251 learning data acquisition unit, 252 learning DB, 253 learning unit, 254 providing unit, 261 target data acquisition unit, 262 matching DB, 263 matching processing unit, 264 matching result providing unit

Claims (20)

1. An information processing device comprising a control unit that uses a determination model generated by machine learning using personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
2. The information processing device according to claim 1, wherein the personal data includes data on the user's compatibility with other people, the sensor data includes captured image data of the user's surroundings, and the control unit uses the determination model to control the discharge of the fragrance for the user who acts toward, as a target, a surrounding person estimated from the input captured image data according to the compatibility.
3. The information processing device according to claim 2, wherein the control unit controls the discharge of the fragrance based on behavior information about the target's behavior toward the user, obtained from the sensor data.
4. The information processing device according to claim 3, wherein the control unit changes the type of the fragrance depending on whether the target is looking in the direction of the user.
5. The information processing device according to claim 4, wherein, when the target is not looking in the direction of the user, the control unit changes the user's appearance as seen from the target so as to attract the target's attention.
6. The information processing device according to claim 2, wherein the control unit controls the discharge of the fragrance based on distance information about the distance between the user and the target, obtained from the sensor data.
7. The information processing device according to claim 6, wherein the control unit increases the discharge amount of the fragrance when the value indicated by the distance information is less than a predetermined threshold.
8. The information processing device according to claim 2, wherein the control unit controls the discharge of the fragrance based on emotion data about the feelings of the surrounding person serving as the target toward the user.
9. The information processing device according to claim 8, wherein the control unit changes the type of the fragrance according to the compatibility of the surrounding person serving as the target with the user.
10. The information processing device according to claim 8, wherein the control unit feeds back the emotion data and controls the discharge of the fragrance customized according to the fed-back emotion data.
11. The information processing device according to claim 1, wherein the personal data includes the user's desired conditions for estimating the target.
12. The information processing device according to claim 2, wherein the control unit acquires target data about the target transmitted from each user's terminal, and performs matching according to the compatibility of each user based on the target data for each user.
13. The information processing device according to claim 12, wherein the control unit controls the discharge of the fragrance based on distance information about the distance between the matched users, obtained from the sensor data of the matched users' terminals.
14. The information processing device according to claim 13, wherein the control unit increases the discharge amount of a predetermined type of fragrance as the distance between the matched users becomes shorter.
15. The information processing device according to claim 13, wherein the control unit discharges a predetermined type of fragrance when the value indicated by the distance information is less than a predetermined threshold.
16. The information processing device according to claim 12, wherein the control unit discharges a predetermined type of fragrance when at least one of the matched users feels uncomfortable, based on emotion data about the matched users' feelings toward each other.
17. The information processing device according to claim 12, wherein, at the time of matching, the control unit determines the user with the highest degree of matching as the matching partner based on the compatibility together with the distance between the matching users at that time.
18. The information processing device according to claim 1, configured as a wearable terminal having a device that detects the sensor data.
19. The information processing device according to claim 12, configured as a server connected via a network to a wearable terminal having a device that detects the sensor data.
20. An information processing method in which an information processing device uses a determination model generated by machine learning using personal data for each user to control the discharge of a fragrance for supporting the user with respect to a target estimated from input sensor data.
PCT/JP2020/041733 2019-11-22 2020-11-09 Information processing device and information processing method WO2021100515A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021558299A JPWO2021100515A1 (en) 2019-11-22 2020-11-09

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-211518 2019-11-22
JP2019211518 2019-11-22

Publications (1)

Publication Number Publication Date
WO2021100515A1 true WO2021100515A1 (en) 2021-05-27

Family

ID=75981213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/041733 WO2021100515A1 (en) 2019-11-22 2020-11-09 Information processing device and information processing method

Country Status (2)

Country Link
JP (1) JPWO2021100515A1 (en)
WO (1) WO2021100515A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003076874A (en) * 2001-09-03 2003-03-14 Nec Corp Aroma distribution system, apparatus and program thereof
US20060074742A1 (en) * 2004-09-27 2006-04-06 Carmine Santandrea Scent delivery devices and methods
JP2009151616A (en) * 2007-12-21 2009-07-09 Konica Minolta Holdings Inc Event performance system
JP2011065504A (en) * 2009-09-18 2011-03-31 Tokyo Univ Of Science Preference prediction server for generating prediction model concerning preference relation of user and method therefor
WO2013008717A1 (en) * 2011-07-08 2013-01-17 株式会社バンダイナムコゲームス Game system, program, and storage medium


Also Published As

Publication number Publication date
JPWO2021100515A1 (en) 2021-05-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20890241; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021558299; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20890241; Country of ref document: EP; Kind code of ref document: A1)