WO2023053278A1 - Provision system, provision method, and computer program - Google Patents
Provision system, provision method, and computer program
- Publication number
- WO2023053278A1 (PCT/JP2021/035898)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
Definitions
- the present disclosure relates to a provision system, a provision method, and a computer program.
- Conventionally, at live venues such as concerts, systems have been developed that analyze the emotional transitions of event participants from camera images of the participants' facial expressions (see, for example, Non-Patent Document 1).
- An information processing system that analyzes the user's emotions using a wearable terminal worn by the user has also been proposed (see, for example, Patent Document 1).
- A provision system includes: a presentation unit that presents to the user that a sensor, which has a contact surface and measures a physical quantity indicating the degree of change in the surface contacted by the contact surface, being in contact with the user's body surface is a condition for permission to provide the physical quantity to a person other than the user; an estimation unit that estimates the state of contact of the sensor with the body surface based on the physical quantity; and a providing unit that starts providing the physical quantity to the other person based on the estimation result of the contact state by the estimation unit.
- A provision method includes: a step of presenting to the user that a sensor, which has a contact surface and measures a physical quantity indicating the degree of change in the surface contacted by the contact surface, being in contact with the user's body surface is a condition for permission to provide the physical quantity to a person other than the user; a step of estimating the state of contact of the sensor with the body surface based on the physical quantity; and a step of starting to provide the physical quantity to the other person based on the estimation result of the contact state.
- A computer program causes a computer to function as: a presentation unit that presents to the user that a sensor, which has a contact surface and measures a physical quantity indicating the degree of change in the surface contacted by the contact surface, being in contact with the user's body surface is a condition for permission to provide the physical quantity to a person other than the user; an estimation unit that estimates the state of contact of the sensor with the body surface based on the physical quantity; and a providing unit that starts providing the physical quantity to the other person based on the estimation result of the contact state by the estimation unit.
- FIG. 1 is a diagram showing an example of the overall configuration of an emotion analysis system according to an embodiment.
- FIG. 2 is a block diagram showing an example of each configuration of a sensor and a server that configure the emotion analysis system according to the embodiment.
- FIG. 3 is a diagram showing an example of a sensor configuration.
- FIG. 4 is a diagram showing an example of arrangement positions of strain gauges on a circuit layer.
- FIG. 5 is a diagram showing an example of a release sheet that constitutes the sensor.
- FIG. 6 is a block diagram illustrating an example of the configuration of the smartphone according to the embodiment;
- FIG. 7 is a sequence diagram showing an example of user registration processing for registering user information of a user using a sensor in a server.
- FIG. 8 is a diagram showing an example of a user information registration screen.
- FIG. 9 is a diagram showing an example of user information registered in the server.
- FIG. 10 is a sequence diagram illustrating an example of event registration processing.
- FIG. 11 is a diagram showing an example of the event information screen.
- FIG. 12 is a diagram showing an example of the sticker information screen.
- FIG. 13 is a diagram showing an example of the terms of use information screen.
- FIG. 14 is a sequence diagram illustrating an example of linking processing.
- FIG. 15 is a diagram showing an example of a QR code (registered trademark) reading screen.
- FIG. 16 is a diagram showing an example of a user ID input screen.
- FIG. 17 is a diagram illustrating an example of user information in which a sensor ID is additionally registered.
- FIG. 18 is a sequence diagram illustrating an example of strain amount measurement processing.
- FIG. 19 is a diagram showing an example of temporal changes in strain amount.
- FIG. 20 is a diagram showing an example of temporal change in strain amount.
- FIG. 21 is a block diagram
- In the system of Non-Patent Document 1, acquisition and analysis of data such as images related to users' emotions and movements may be performed without the users' permission. Some users may therefore feel uncomfortable, or feel a sense of resistance, such as rejection or fear that their emotions will be read.
- In the system of Patent Document 1, the user must wear a wearable terminal such as a smart watch, smart band, or smart glasses. Wearing such a wearable terminal can feel uncomfortable to the user and may likewise cause a sense of resistance, such as rejection or fear that the user's emotions will be read.
- The present disclosure has been made in view of such circumstances, and aims to provide a provision system, a provision method, and a computer program for analyzing emotions without giving the user a feeling of discomfort or resistance.
- A provision system includes: a presentation unit that presents to the user that a sensor, which has a contact surface and measures a physical quantity indicating the degree of change in the surface contacted by the contact surface, being in contact with the user's body surface is a condition for permission to provide the physical quantity to a person other than the user; an estimation unit that estimates the state of contact of the sensor with the body surface based on the physical quantity; and a providing unit that starts providing the physical quantity to the other person based on the estimation result of the contact state by the estimation unit.
- According to this configuration, the user brings the sensor into contact with a body surface such as the face. The user can bring the sensor into contact with the body surface as if putting a sticker on the face.
- Users are generally less reluctant to put stickers on their faces at sporting event venues, attraction venues, and the like. Therefore, others can collect the physical quantity and analyze the user's emotions without causing the user discomfort or a sense of resistance.
- The estimation unit estimates that the sensor is in contact with the body surface based on a comparison result between the physical quantity and a first threshold, and the providing unit may start providing the physical quantity to the other person when the estimation unit estimates that the sensor is in contact with the body surface.
- Before the sensor contacts the body surface, the change in the contact surface is small, so the physical quantity is also small. Therefore, for example, when the physical quantity is equal to or greater than the first threshold, it can be estimated that the sensor is in contact with the body surface, and provision of the physical quantity to the other person can be started. This prevents the physical quantity from being provided to others before the sensor makes contact, so the others can analyze emotions accurately.
- the estimation unit may estimate that the sensor is in contact with the body surface based on the duration of the comparison result between the physical quantity and the first threshold.
- If the seal is temporarily deformed by external pressure before it contacts the body surface, the physical quantity increases temporarily, but the large value does not last. Therefore, for example, it can be estimated that the sensor is in contact with the body surface when the physical quantity remains equal to or greater than the first threshold for a predetermined period of time.
- The estimation unit may estimate that the sensor is not in contact with the body surface based on a comparison result between the physical quantity and a second threshold, and the providing unit may end the provision of the physical quantity to the other person when the estimation unit estimates that the sensor is not in contact with the body surface.
- While the sensor is not in contact with the body surface, the change in the contact surface is small, so the physical quantity is also small. Therefore, for example, when the physical quantity is less than the second threshold, it can be estimated that the sensor is not in contact with the body surface, and the provision of the physical quantity to the other person can be ended. This prevents the physical quantity from being provided to others while the sensor is not in contact with the body surface, so the others can analyze emotions accurately.
- the estimation unit may estimate that the sensor is not in contact with the body surface based on the duration of the comparison result between the physical quantity and the second threshold.
- Even if the sensor is in contact with the body surface, the physical quantity may temporarily decrease when the user momentarily becomes expressionless. However, the body surface vibrates finely, so a constant small change occurs even on an expressionless face, and a state in which the physical quantity is small does not last. Therefore, for example, it can be estimated that the sensor is not in contact with the body surface when the physical quantity remains below the second threshold for a predetermined period of time.
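The two-threshold, duration-based estimation described above can be sketched as a small state machine. This is an illustrative sketch only, not the patent's implementation: the threshold values and the required durations (expressed here as consecutive sample counts) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ContactEstimator:
    """Estimates whether the sticker sensor is touching the body surface.

    on_threshold (first threshold), off_threshold (second threshold) and
    hold_samples (the "predetermined period of time") are assumed values.
    """
    on_threshold: float = 5.0    # strain at/above this suggests contact
    off_threshold: float = 1.0   # strain below this suggests no contact
    hold_samples: int = 3        # how long the condition must persist
    in_contact: bool = False
    _run: int = 0                # consecutive samples meeting the pending condition

    def update(self, strain: float) -> bool:
        """Feed one strain sample; return the current contact estimate."""
        if not self.in_contact:
            # Contact is assumed only after the strain stays at or above the
            # first threshold for the duration, so a brief squeeze of the
            # seal before attachment is ignored.
            self._run = self._run + 1 if strain >= self.on_threshold else 0
            if self._run >= self.hold_samples:
                self.in_contact, self._run = True, 0
        else:
            # Contact ends only after the strain stays below the second
            # threshold for the duration; a momentarily expressionless face
            # does not end provision.
            self._run = self._run + 1 if strain < self.off_threshold else 0
            if self._run >= self.hold_samples:
                self.in_contact, self._run = False, 0
        return self.in_contact
```

Feeding samples one by one, a short dip below the second threshold while attached leaves the estimate unchanged, matching the debounce behaviour the text describes.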
- The provision system may further include a registration unit that registers a sensor identifier identifying the sensor, and a physical quantity reception unit that receives sets of the physical quantity and the sensor identifier from the sensor, and the providing unit may provide the other person with those sets, among the sets received by the physical quantity reception unit, that include a sensor identifier registered by the registration unit.
- According to this configuration, the provision system can select, from among the sets of physical quantities and sensor identifiers received from a plurality of sensors, the sets that include a registered sensor identifier, and provide them to others. For example, by registering in advance the sensor identifier of the sensor used by the user, the user's physical quantity can be provided to others efficiently.
- The providing unit may further refrain from transmitting, from among the sets received by the physical quantity reception unit, the sets that do not include a sensor identifier registered by the registration unit to the device used by the other person.
- According to this configuration, sets containing sensor identifiers that are not registered in the provision system are not provided to others. For example, if the provision system in which a given sensor identifier is registered is uniquely determined, the set of that sensor identifier and the physical quantity is provided to others from only one provision system. Duplicate physical quantities are therefore prevented from being provided to others.
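The registration-based filtering above amounts to keeping, from the received sets, only those whose sensor identifier is registered. A minimal sketch, with hypothetical function and variable names and a (sensor_id, strain) tuple layout that the patent does not prescribe:

```python
def select_sets_to_provide(received, registered_ids):
    """Return only (sensor_id, strain) sets whose sensor ID is registered.

    Sets with unregistered sensor IDs are dropped rather than forwarded,
    so a sensor registered with exactly one provision system is provided
    to others from that system alone, avoiding duplicate data.
    """
    return [pair for pair in received if pair[0] in registered_ids]
```

A set from an unknown sensor, e.g. one registered with a different provision system, simply never reaches the device used by the other person.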
- The registration unit may further provide a user identifier identifying the user, together with the sensor identifier, to the device used by the other person.
- the device used by the other person can identify the user from the combination of the physical quantity and the sensor identifier provided by the providing system, and can analyze the emotion of each user.
- The user identifier is not included in the sets that the provision system provides to others. Therefore, even if a third party intercepts a set, the third party cannot identify the user whose physical quantity was measured, and the user's privacy is protected.
- the providing unit may further provide the measured position information of the physical quantity to a device used by the other person.
- the other person who receives the measurement position information of the physical quantity can analyze the emotion for each measurement position.
- The provision system may further include a setting unit with which the user sets whether provision of the physical quantity is permitted, and the providing unit may stop providing the physical quantity to the other person when the user has not permitted the provision.
- A provision method includes: a step of presenting to the user that a sensor, which has a contact surface and measures a physical quantity indicating the degree of change in the surface contacted by the contact surface, being in contact with the user's body surface is a condition for permission to provide the physical quantity to a person other than the user; a step of estimating the state of contact of the sensor with the body surface based on the physical quantity; and a step of starting to provide the physical quantity to the other person based on the estimation result of the contact state.
- This configuration includes steps corresponding to the components of the provision system described above. Therefore, the same actions and effects as those of the provision system described above can be obtained.
- A computer program causes a computer to function as: a presentation unit that presents to the user that a sensor, which has a contact surface and measures a physical quantity indicating the degree of change in the surface contacted by the contact surface, being in contact with the user's body surface is a condition for permission to provide the physical quantity to a person other than the user; an estimation unit that estimates the state of contact of the sensor with the body surface based on the physical quantity; and a providing unit that starts providing the physical quantity to the other person based on the estimation result of the contact state by the estimation unit.
- the computer can function as the above-described provision system. Therefore, it is possible to obtain the same actions and effects as those of the above-described providing system.
- FIG. 1 is a diagram showing an example of the overall configuration of an emotion analysis system according to an embodiment.
- the emotion analysis system 1 includes a sensor 3, a server 4, and a smartphone 5.
- the sensor 3 has an adhesive surface and is shaped like a sheet.
- the sensor 3 is attached to the face of the user 2 (human) with an adhesive surface, for example, and measures a physical quantity that indicates the degree of change in the shape of the face according to the change in facial expression.
- the adhesive surface of the sensor 3 is an example of the contact surface of the sensor 3 with the face.
- The sensor 3 can also measure the physical quantity while held in contact with the face by a fixture or the like, without being adhered to the face.
- the sensor 3 may be used by attaching it to the body surface of the user 2 other than the face.
- the sheet-like shape of the sensor 3 includes not only a shape without unevenness on the surface, but also a three-dimensional shape with some unevenness on the surface of the sensor 3 from the viewpoint of fashionability, functionality, and the like.
- For example, the surface of the sensor 3 may be decorated with unevenness.
- the contact surface of the sensor 3 may be uneven to soften the contact with the skin.
- Although FIG. 1 shows the sensor 3 attached to only one user 2, it is assumed that a sensor 3 is attached to each of a plurality of users 2.
- the sensor 3 is capable of short-range wireless communication with the smartphone 5.
- the sensor 3 performs BLE (Bluetooth (registered trademark) Low Energy) communication with the smartphone 5 .
- The sensor 3 has the function of a BLE beacon and broadcasts, in real time on radio waves in the 2.4 GHz band, for example, the measured physical quantity and the sensor ID (sensor identifier) that distinguishes it from other sensors 3. Since BLE communication uses adaptive frequency hopping, interference between the radio waves transmitted by this sensor 3 and those transmitted by other sensors 3 can be reduced.
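As a rough illustration of such a beacon, the strain amount and sensor ID could be packed into a manufacturer-specific BLE advertising data field. The byte layout below is purely an assumption for illustration; the patent does not specify any payload format, and the company ID used is a reserved test value.

```python
import struct

def build_adv_payload(sensor_id: int, strain_ua: int) -> bytes:
    """Pack sensor ID and strain into a manufacturer-specific BLE AD field.

    Hypothetical layout:
      length (1 B) | AD type 0xFF (1 B) | company ID (2 B, little-endian)
      | sensor ID (4 B) | strain as a current in microamperes (2 B)
    """
    company_id = 0xFFFF  # reserved test ID; a real product needs its own
    body = struct.pack("<HIH", company_id, sensor_id, strain_ua)
    return bytes([len(body) + 1, 0xFF]) + body
```

A 10-byte payload like this fits comfortably in a legacy 31-byte BLE advertisement, which is consistent with the compact, power-saving communication the sensor requires.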
- the smartphone 5 is a terminal device owned by the user 2, and is capable of short-range wireless communication with the sensor 3.
- the smartphone 5 has a receiver function that performs BLE communication with the sensor 3 and receives physical quantities and sensor IDs transmitted from the sensor 3 .
- the smartphone 5 is connected to a network 6 such as the Internet via a wireless base station 7 .
- Wireless communication is performed between the smartphone 5 and the wireless base station 7 in compliance with a wireless communication standard such as 5G (fifth generation mobile communication system) or Wi-Fi (registered trademark).
- The smartphone 5 extracts, from the sets of physical quantity and sensor ID received from the sensor 3, the sets that include a pre-registered sensor ID, adds the position information of the smartphone 5 to the extracted sets, and transmits them to the server 4 via the network 6 in real time. Since the user 2 carries the smartphone 5, the server 4 can regard the position of the smartphone 5 as the position where the user 2's physical quantity was measured.
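The smartphone-side relay step can be sketched as follows. The field names and JSON shape are assumptions for illustration; the patent only states that position information is added to the set data and that it is forwarded in real time.

```python
import json

def relay_to_server(received_sets, my_sensor_id, latitude, longitude):
    """Keep only sets from the user's own pre-registered sensor, attach
    the smartphone's position, and return the request body for the server.
    """
    records = [
        {"sensor_id": sid, "strain": strain, "lat": latitude, "lon": longitude}
        for sid, strain in received_sets
        if sid == my_sensor_id  # drop broadcasts from other users' sensors
    ]
    return json.dumps(records)
```

Because every record carries the smartphone's coordinates, the server can later treat them as the measurement position of that user's physical quantity.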
- The user 2 can also use a terminal device capable of wireless communication, such as a mobile phone or a notebook computer, instead of the smartphone 5.
- the server 4 is a device used by someone other than the user 2.
- the other person is, for example, a business operator that analyzes the emotions of the user 2 using data collected from the user 2 .
- the server 4 functions as an emotion analysis device, is wired or wirelessly connected to the network 6 , and receives group data including physical quantities, sensor IDs, and position information from each of the plurality of smartphones 5 via the network 6 . Since the sensor 3 and the smartphone 5 transmit the set data in real time, the server 4 can estimate the time when the set data is received as the measurement time of the physical quantity.
- User information, including personal information such as the sex and date of birth of the user 2 and the sensor ID of the sensor 3 used by the user 2, is pre-registered in the server 4 for each user 2. The server 4 can therefore identify the user 2 associated with received set data based on the set data and the user information, and analyzes each user 2's emotions based on the physical quantities included in that user's set data.
- When participants enter an event venue such as a sports viewing venue or an attraction venue, the event organizer distributes a sensor 3 to each participating user 2.
- the user 2 who has received the sensor 3 puts the sensor 3 on his/her face (for example, the cheek portion) to bring the sensor 3 into contact with the face.
- Characters or images (for example, of the team the user supports) or the like may be printed on the surface of the sensor 3 that is visible to others. The user 2 can thereby attach the sensor 3 to the face as if attaching a face sticker.
- The server 4 can analyze the emotions of the users 2 for each position or area based on the position information included in the set data received from the smartphones 5. For example, the server 4 analyzes the users' emotions for each attraction (e.g., roller coaster, merry-go-round) at an attraction venue (e.g., theme park). This allows the server 4 to analyze, for example, which attractions are popular.
- FIG. 2 is a block diagram showing an example of each configuration of the sensor 3 and the server 4 that constitute the emotion analysis system 1 according to the embodiment.
- the sensor 3 includes a power supply unit 31, a sensor unit 32, a wireless communication unit 34, and a storage unit 39.
- the power supply unit 31 supplies power to the sensor unit 32, the wireless communication unit 34, and the storage unit 39 to drive each processing unit.
- the power supply unit 31 is, for example, a sheet-type battery or a patch-type battery (see Non-Patent Document 2, for example). In FIG. 2, power supply lines are indicated by dashed lines.
- the power supply unit 31 may generate power using biological energy and supply the generated power.
- The power supply unit 31, for example, performs temperature-difference power generation using the thermal energy of the user 2 to whom the sensor 3 is attached. More specifically, the power supply unit 31 includes a sheet-like Seebeck element (thermoelectric element) in contact with the skin of the user 2, and generates power using the Seebeck effect, in which a temperature difference across the element produces an electromotive force.
- the power supply unit 31 may generate electricity using the sweat of the living body. More specifically, the power supply unit 31 includes a sheet-like biofuel cell that is in contact with the skin of the user 2, and the biofuel cell converts sweat into electric current using an enzyme that oxidizes lactic acid contained in human sweat. Power is generated by conversion.
- the power supply unit 31 may generate power using electromagnetic waves. More specifically, the power supply unit 31 may recover energy from electromagnetic waves existing around the sensor 3 and generate electricity by itself (see Non-Patent Document 3, for example).
- the power supply unit 31 is not limited to the one described above.
- the storage unit 39 is a storage device for storing the sensor ID of the sensor 3.
- the sensor unit 32 measures a physical quantity that indicates the degree of change in the shape of the surface (the face of the user 2) with which the adhesive surface of the sensor unit 32 contacts.
- the sensor section 32 includes, for example, strain gauges.
- The strain gauge measures, as a current, the force applied to it by the movement of the facial skin of the user 2 (hereinafter, this measured value is referred to as the "strain amount"; it is an example of the physical quantity).
- the sensor unit 32 outputs the strain amount measured by the strain gauge to the wireless communication unit 34 .
- the wireless communication unit 34 has a communication interface that performs compact and power-saving wireless communication.
- the wireless communication unit 34 performs data communication according to, for example, the BLE communication standard as described above.
- the wireless communication unit 34 may perform data communication according to a communication standard such as Wi-SUN (Wireless Smart Utility Network) or ZigBee (registered trademark).
- the wireless communication unit 34 transmits to the server 4 set data of the strain amount measured by the sensor unit 32 and the sensor ID stored in the storage unit 39 .
- FIG. 3 is a diagram showing an example of the configuration of the sensor 3.
- the sensor 3 is composed of three layers, for example, a sheet-like adhesive layer 3A, a sheet-like circuit layer 3B, and a sheet-like printing layer 3C.
- The underside of the adhesive layer 3A is coated with a skin adhesive that causes little irritation to the skin.
- a release sheet 3D is provided below the adhesive. After peeling off the release sheet 3D, the user 2 attaches the sensor 3 to the skin by bringing the lower part of the adhesive layer 3A into contact with the skin. Note that the user 2 can reattach the sensor 3 after removing it from the skin.
- a circuit layer 3B is arranged above the adhesive layer 3A.
- the circuit layer 3B includes a circuit that realizes the power supply unit 31, the sensor unit 32, the wireless communication unit 34, and the storage unit 39 described above.
- the circuit layer 3B is configured, for example, by arranging an IC (Integrated Circuit) chip on a flexible substrate.
- the printed layer 3C is arranged above the circuit layer 3B and has a printed surface.
- the printing surface is located above the printing layer 3C, and at least one of characters and images can be printed on the printing surface.
- Part or all of the printed surface may be plain (for example, white), or may be transparent or translucent, so that the user 2 can write characters or draw images on it with a pen or the like.
- FIG. 4 is a diagram showing an example of the arrangement positions of the strain gauges 38 on the circuit layer 3B.
- the strain gauge 38 forms part of the sensor section 32 .
- Strain gauges 38 are arranged, for example, at the four corners and the center of the circuit layer 3B. By arranging the strain gauges 38 at a plurality of locations in this manner, the sensor unit 32 can measure the degree of change in the shape of the face with higher accuracy than when a strain gauge is arranged at only one location.
- the strain amount measured by the sensor unit 32 is a representative value (for example, average value, maximum value, minimum value, median value, etc.) of the strain amounts measured by the plurality of strain gauges 38 .
- a set of strain amounts measured by a plurality of strain gauges 38 may be used as the strain amount measured by the sensor section 32 .
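Reducing the five gauge readings (four corners plus the center) to one representative value can be sketched directly. The text leaves the choice of reduction open (average, maximum, minimum, median, etc.), so the default here is just one of the listed options.

```python
from statistics import mean, median

def representative_strain(gauge_values, method="mean"):
    """Reduce the strain amounts from multiple gauges to one value.

    method selects one of the representative values named in the text:
    "mean", "max", "min", or "median".
    """
    reducers = {"mean": mean, "max": max, "min": min, "median": median}
    return reducers[method](gauge_values)
```

Alternatively, as the text notes, the full tuple of per-gauge strains could be transmitted instead of a single reduced value.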
- FIG. 5 is a diagram showing an example of a release sheet 3D that constitutes the sensor 3.
- Terms of use 61 and a QR code (registered trademark) 62 are printed in advance on the lower surface of the release sheet 3D (that is, the surface that the user 2 can see without peeling off the release sheet 3D).
- the terms of use 61 indicate the terms of use for the user 2 to use the sensor 3.
- the terms of use 61 particularly indicate that attachment of the sensor 3 is a condition for permission to provide strain amounts to others other than the user 2 .
- the terms of use 61 indicate that the measurement of biometric information (strain amount) is to be started at the time the sticker (sensor 3) is attached, and that the biometric information is to be used for emotion analysis.
- The terms of use 61 may also indicate that the biological information and the analyzed emotion information belong to the company (here, the "Kxx Co., Ltd." that uses the server 4) and may be used for secondary purposes.
- The terms of use 61 further state, for example, that measurement of biological information ends when the sticker is peeled off the body surface, that interruption of measurement can be set from the application, that the company is exempt from liability for incorrect use of the sticker, and that the user consents to all the described matters at the moment the sticker is attached.
- the QR code 62 contains information in which the sensor ID of the sensor 3 is encoded.
- the smartphone 5 can obtain the sensor ID of the sensor 3 by the user 2 using the camera of the smartphone 5 to read the QR code 62 .
- the server 4 includes a communication unit 41, a processing unit 42, a storage unit 43, and a clock unit 44.
- the server 4 can be realized by a computer equipped with a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), HDD (Hard Disk Drive), communication interface, and the like.
- the processing unit 42 is functionally realized by developing a computer program stored in a non-volatile memory such as a ROM or HDD into a RAM and executing it on the CPU.
- the communication unit 41 receives set data of strain amount, sensor ID, and position information from the smartphone 5 via the network 6 .
- the clock unit 44 is a clock or timer that measures time.
- it is assumed that the storage unit 43 stores, for each user 2, user information including a user ID (user identifier) for identifying the user 2, personal information such as the gender and date of birth of the user 2, and the sensor ID of the sensor 3 used by the user 2.
- the processing unit 42 stores the set data received by the communication unit 41 in the storage unit 43 together with the time measured by the clock unit 44.
- the time measured by the clock unit 44 is treated as the strain amount measurement time.
- the processing unit 42 reads out the group data and the user information from the storage unit 43 .
- the processing unit 42 identifies the user 2 associated with the set data based on the read set data and the user information. That is, the processing unit 42 identifies the sensor ID included in the set data, and identifies the user ID corresponding to the identified sensor ID based on the user information.
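The sensor-ID-to-user-ID lookup described above can be sketched as follows; the dictionary layout, field names, and sample IDs are illustrative assumptions:

```python
# User information as held by the storage unit 43 (invented sample values)
user_info = {
    "U001": {"gender": "male", "sensor_id": "S123"},
    "U002": {"gender": "female", "sensor_id": "S124"},
}

def user_for_sensor(sensor_id, users):
    """Return the user ID whose registered sensor ID matches, or None."""
    for user_id, info in users.items():
        if info["sensor_id"] == sensor_id:
            return user_id
    return None

# Set data as received from a smartphone 5 (invented sample values)
set_data = {"sensor_id": "S123", "strain": 140.0, "position": (35.65, 139.75)}
print(user_for_sensor(set_data["sensor_id"], user_info))  # U001
```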
- the processing unit 42 analyzes the emotion for each user ID (for each user 2) based on the strain amount included in the group data corresponding to the user ID.
- the emotion to be analyzed is, for example, user 2's joy.
- the processing unit 42 determines the joy level using a learning model that has learned in advance the relationship between time-series strain amounts and the joy level, which quantifies the degree of joy, by inputting the measurement times and strain amounts into the learning model.
- the learning model is, for example, a multilayer neural network, and the parameters of the neural network are machine-learned by deep learning using a learning set in which the measurement times and strain amounts are the inputs and the joy level is the output.
- the learning model is not limited to a neural network; other discriminators such as a linear regression model, a logistic regression model, a support vector machine, a random forest, AdaBoost, naive Bayes, or the k-nearest neighbor method may be used.
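Among the discriminators listed above, the k-nearest neighbor method is simple enough to sketch in pure Python; the feature choice (mean strain, strain variance), the training data, and the 0-2 joy-level labels are invented purely for illustration:

```python
def knn_predict(train, query, k=3):
    """Predict a joy level for `query` (a strain feature vector) by
    majority vote among the k nearest training examples (Euclidean distance).

    `train` is a list of (feature_vector, joy_level) pairs.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Invented training set: (mean strain, strain variance) -> joy level 0..2
train = [
    ((50.0, 5.0), 0), ((60.0, 8.0), 0),
    ((120.0, 20.0), 1), ((130.0, 25.0), 1),
    ((200.0, 60.0), 2), ((210.0, 70.0), 2),
]
print(knn_predict(train, (125.0, 22.0)))  # 1
```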
- the processing unit 42 may further create display data for displaying the analysis result of the joy level for each user 2 on the screen of a display device.
- the display data may, for example, show the distribution of the joy levels of a plurality of users 2 in a table format or a graph format, or may show the joy levels in a map format in association with the positions of the users 2.
- the processing unit 42 may analyze the emotions of the user 2 for each piece of personal information. For example, the processing unit 42 may analyze the emotions of the user 2 by gender, or may analyze the emotions of the user 2 by age.
- position information is included in the grouped data received by the communication unit 41 . Therefore, the processing unit 42 may analyze the emotion of the user 2 for each position or area.
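Per-area emotion analysis could, for example, bucket positions into a coarse latitude/longitude grid; the grid scheme, cell size, and sample records below are assumptions, not part of the patent:

```python
from collections import defaultdict

def area_key(lat, lon, cell=0.01):
    """Bucket a latitude/longitude into a coarse grid cell (assumed scheme)."""
    return (round(lat / cell) * cell, round(lon / cell) * cell)

def average_joy_by_area(records):
    """records: iterable of (lat, lon, joy_level); returns {area: mean joy}."""
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, joy in records:
        key = area_key(lat, lon)
        sums[key][0] += joy
        sums[key][1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

# Invented records: two users in one area, one user in another
records = [
    (35.651, 139.751, 2), (35.652, 139.752, 1),  # same grid cell
    (35.701, 139.801, 0),
]
print(average_joy_by_area(records))
```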
- FIG. 6 is a block diagram showing an example of the configuration of the smartphone 5 according to the embodiment.
- the smartphone 5 includes a communication unit 51, a storage unit 52, a touch panel 53, a camera 54, a position detection unit 55, and a processing unit 56, which are interconnected via an internal bus 57.
- the communication unit 51 has a communication interface that performs short-range, low-power wireless communication.
- the communication unit 51 receives set data broadcast from the wireless communication unit 34 of the sensor 3, for example, in accordance with the BLE communication standard as described above.
- the communication unit 51 may perform data communication according to a communication standard such as Wi-SUN or ZigBee (registered trademark).
- the communication unit 51 is connected to the network 6 via the wireless base station 7 by communication according to wireless communication standards such as 5G or Wi-Fi (registered trademark).
- the storage unit 52 stores the sensor ID of the sensor 3 attached to the user 2 who owns the smartphone 5. Note that there are also users 2 who do not possess smartphones 5 . Therefore, the storage unit 52 may store the sensor ID of the sensor 3 that is attached to the user 2 other than the user 2 who owns the smartphone 5 . As a result, for example, the storage unit 52 stores the sensor ID of the sensor 3 attached to each of the user 2 who owns the smartphone 5 and another user 2 who accompanies the user 2 .
- the storage unit 52 is composed of, for example, a non-volatile memory or a volatile memory.
- the touch panel 53 has functions as a display device (display panel) that displays various information to the user 2 and as an input device (touch sensor) that accepts input of various information from the user 2 .
- the camera 54 is used to read the QR code 62 printed on the release sheet 3D of the sensor 3 shown in FIG.
- the position detection unit 55 detects the position of the smartphone 5 using satellite navigation.
- the position detection unit 55 detects the position of the smartphone 5 based on radio waves received from multiple GPS (Global Positioning System) satellites.
- the position of the smartphone 5 can be specified by latitude and longitude, for example.
- Satellite navigation uses a satellite positioning system (GNSS: Global Navigation Satellite System) such as GPS, but is not limited to GPS.
- the processing unit 56 is composed of a processor such as a CPU, for example.
- the processing unit 56 includes, as functional processing units realized by executing a computer program stored in advance in the storage unit 52, a presentation unit 56A, a strain amount receiving unit 56B, an estimation unit 56C, a providing unit 56D, and a registration unit 56E.
- the presentation unit 56A causes the touch panel 53 to display the terms of use for using the sensor 3.
- the terms of use displayed on the touch panel 53 include the same content as the terms of use 61 printed on the release sheet 3D of the sensor 3 shown in FIG. In other words, the terms of use displayed on the touch panel 53 indicate that attachment of the sensor 3 is a condition for permission to provide the strain amount to others other than the user 2 .
- the strain amount receiving unit 56B receives set data of the strain amount and the sensor ID of the sensor 3 from the sensor 3 via the communication unit 51 .
- the estimating unit 56C estimates the contact state of the sensor 3 with the body surface of the user 2 based on the amount of strain included in the set data received by the strain amount receiving unit 56B. That is, the estimation unit 56C estimates that the sensor 3 is in contact with the body surface of the user 2 or that the sensor 3 is not in contact with the body surface of the user 2 based on the amount of strain.
- a contact state estimation method will be described later.
- the providing unit 56D starts providing the strain amount to the server 4 based on the contact state estimated by the estimation unit 56C. That is, the providing unit 56D starts providing set data including the strain amount to the server 4 when the estimation unit 56C estimates that the sensor 3 is in contact with the body surface of the user 2.
- the providing unit 56D adds the position information detected by the position detection unit 55 to the group data including the sensor ID stored in the storage unit 52, among the group data received by the strain amount receiving unit 56B. and provided to the server 4 via the communication unit 51 .
- the providing unit 56D discards group data not including the sensor ID stored in the storage unit 52 among the group data received by the strain amount receiving unit 56B, and does not provide the group data to the server 4.
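The registered-ID filtering performed by the providing unit 56D can be sketched as follows; the data layout and sample IDs are illustrative assumptions:

```python
REGISTERED_SENSOR_IDS = {"S123", "S124"}  # IDs stored in the storage unit 52

def split_set_data(received, registered=REGISTERED_SENSOR_IDS):
    """Partition received set data: forward registered IDs, discard the rest."""
    to_provide, discarded = [], []
    for item in received:
        (to_provide if item["sensor_id"] in registered else discarded).append(item)
    return to_provide, discarded

# Invented set data received by the strain amount receiving unit 56B
received = [
    {"sensor_id": "S123", "strain": 140.0},
    {"sensor_id": "S999", "strain": 95.0},  # not registered on this smartphone
]
provide, drop = split_set_data(received)
print([d["sensor_id"] for d in provide])  # ['S123']
print([d["sensor_id"] for d in drop])     # ['S999']
```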
- when the estimation unit 56C estimates that the sensor 3 is not in contact with the body surface of the user 2, the providing unit 56D ends the provision of the strain amount to the server 4.
- the registration unit 56E registers the sensor ID of the sensor 3 attached by the user 2. Specifically, the registration unit 56E causes the storage unit 52 to store the sensor ID acquired by reading the QR code 62 using the camera 54. Further, the registration unit 56E pairs the acquired sensor ID with the user ID of the user 2 and transmits the pair to the server 4. The server 4 receives the pair, associates the sensor ID and the user ID, and registers them in the user information.
- the registration unit 56E executes registration processing for events in which the user 2 participates.
- FIG. 7 is a sequence diagram showing an example of user registration processing for registering the user information of the user 2 who uses the sensor 3 in the server 4.
- User 2 operates touch panel 53 of smartphone 5 to request server 4 to download an application for emotion analysis system 1 (hereinafter referred to as "emotion analysis application").
- the user 2 accesses the web page of the server 4 from which applications can be downloaded, and selects an emotion analysis application from the downloadable applications.
- registration unit 56E transmits to server 4 a request signal requesting download of the emotion analysis application.
- the processing unit 42 of the server 4 receives the request signal via the communication unit 41 (step S1).
- the processing unit 42 of the server 4 reads the emotion analysis application from the storage unit 43 and transmits it to the smartphone 5 via the communication unit 41, and the registration unit 56E receives the emotion analysis application (step S2).
- Registration unit 56E installs the emotion analysis application on smartphone 5 .
- the server 4 is not the only server that provides the emotion analysis application, and a dedicated server other than the server 4 that provides the application may provide the emotion analysis application.
- when the user 2 taps the icon of the emotion analysis application, the processing unit 56 activates the emotion analysis application in response to the tap (step S3).
- when the emotion analysis application is activated, a user information registration screen is displayed (step S4).
- FIG. 8 is a diagram showing an example of a user information registration screen.
- the user information registration screen includes input fields for a user ID, password, gender, date of birth, name and address, an OK button 72 and a cancel button 73 .
- the user 2 operates the touch panel 53 to input user information in each input field, and taps the OK button 72 (step S5).
- the registration unit 56E transmits the input user information to the server 4 via the communication unit 51, and the processing unit 42 of the server 4 receives the user information from the smartphone 5 via the communication unit 41 (step S6).
- the processing unit 42 stores the received user information in the storage unit 43 (step S7).
- User 2 can also cancel the user information registration process by tapping the cancel button 73 . Also, the user 2 can register user information of a plurality of users 2 . For example, User 2 registers user information of other User 2 who accompanies User 2 .
- FIG. 9 is a diagram showing an example of user information registered in the server 4.
- User information 80 includes a user ID, password, gender, date of birth, name and address.
- the user 2 with the user ID "U001" has the password "P0123", the gender "male", the date of birth "October 1, 2000", the name "Taro Sumitomo", and the address "XXX, Minato-ku, Tokyo".
- the user 2 with the user ID "U002" has the password "APX3", the gender "female", the date of birth "June 7, 2003", the name "Hanako Sumitomo", and the address "XXX, Minato-ku, Tokyo".
- the user 2 with the user ID "U002” is the companion of the user 2 with the user ID "U001”.
- FIG. 10 is a sequence diagram showing an example of event registration processing.
- when the user 2 taps the icon of the emotion analysis application, the processing unit 56 activates the emotion analysis application in response to the tap (step S11).
- when the user 2 inputs the user ID and password, the processing unit 56 transmits the input user ID and password to the server 4, and the processing unit 42 of the server 4 receives the user ID and password transmitted from the smartphone 5 (step S12).
- the processing unit 42 executes authentication processing to determine whether the received user ID and password are registered in the user information 80 (step S13).
- the processing unit 42 transmits authenticated information indicating that the user has been authenticated to the transmission source smartphone 5, and the processing unit 56 of the smartphone 5 receives the information (step S14 ).
- after authentication, when the user 2 selects the event information screen display menu on the application (step S15), the registration unit 56E transmits an event information request signal to the server 4, and in response to the request signal, the processing unit 42 reads event information on events for which participation is being solicited from the storage unit 43 and transmits it to the smartphone 5.
- the registration unit 56E receives the event information (step S17).
- the registration unit 56E causes the touch panel 53 to display the event information screen based on the received event information (step S18).
- FIG. 11 is a diagram showing an example of the event information screen.
- the event information screen 90 displays two pieces of event information of events in which the user 2 can participate.
- the number of events is not limited to two.
- Each event information includes the event name, cost, location, date and time.
- the event name of the first event is "Comedy Live", the cost is "5,000 yen for adults, 3,000 yen for junior high school students and younger", the location is "ZZZ City Civic Hall", and the date and time is "December 1, 2021, doors open at 18:00, performance starts at 18:30".
- the event name of the second event is "Rock Concert", the cost is "8,000 yen for adults, 5,000 yen for junior high school students and younger", the venue is "XYZ Concert Hall", and the date and time is "December 2, 2021, doors open at 18:30, performance starts at 19:00".
- each event information is provided with an input field for the number of participants; the user 2 inputs the number of participants for the event he or she wishes to join and taps the OK button 92 to register the participation event (step S19).
- user 2 has entered “two adults” in the number of participants field for the event name “rock concert” and tapped the OK button 92 .
- the registration unit 56E transmits, to the server 4, participation event information including identification information of the event that the user 2 wishes to participate in (for example, the event name) and the number of participants, and the processing unit 42 receives the participation event information (step S20).
- the processing unit 42 executes registration processing for the event that the user 2 wishes to participate in based on the participation event information (step S21). For example, information indicating that two adults of user 2 will participate in a “rock concert” is registered in storage unit 43 .
- the processing unit 42 transmits, to the smartphone 5, sticker information indicating the designs of the sensors 3 that can be distributed to the user 2 who plans to participate, and the registration unit 56E of the smartphone 5 receives the sticker information via the communication unit 51 (step S22).
- the registration unit 56E displays a sticker information screen on the touch panel 53 based on the received sticker information (step S23).
- FIG. 12 is a diagram showing an example of the sticker information screen.
- on the sticker information screen 100, stickers 102A to 102C of three designs are displayed as designs of the sensor 3 that can be distributed to the user 2. Quantity input fields 103A to 103C for the stickers 102A to 102C are also displayed.
- User 2 places an order for stickers by inputting the desired number of sensors 3 in number entry fields 103A to 103C and tapping OK button 104 (step S24). For example, if the user 2 wishes to have one sensor 3 for each of the sticker 102A and the sticker 102B, the user 2 enters "1" in each of the number entry fields 103A and 103B and taps the OK button 104.
- the registration unit 56E transmits information on the ordered stickers to the server 4, and the processing unit 42 of the server 4 receives the information via the communication unit 41 (step S25). For example, the identification information of the stickers 102A and 102B and the number ordered (one each) are sent to the server 4.
- the processing unit 42 executes the sticker ordering process (step S26). For example, the processing unit 42 transmits information such as the name and address of the user 2, the design and number of the sensors 3 to be shipped, and the identification information of the event for which the user 2 has registered to participate, to the e-mail address of the person in charge of shipping the sensor 3.
- the person in charge of shipping sends the sensors 3 so that they reach the user 2 before the date of the event. Alternatively, the person in charge of shipping may send the sensors 3 so that they reach the event site before the date of the event.
- the processing unit 42 transmits the terms of use information of the sensor 3 to the smartphone 5, and the presentation unit 56A of the smartphone 5 receives the terms of use information via the communication unit 51 (step S27).
- the presentation unit 56A displays the terms of use information screen on the touch panel 53 based on the received terms of use information (step S28).
- FIG. 13 is a diagram showing an example of the terms of use information screen.
- the terms of use information screen 110 displays a terms of use 111 for the user 2 to use the sensor 3 and an OK button 112 for confirming that the user 2 has agreed to the terms of use 111 .
- the terms of use 111 include the same content as the terms of use 61 printed on the release sheet 3D of the sensor 3 shown in FIG.
- after receiving the sensor 3, the user 2 executes linking processing for linking the sensor ID of the sensor 3 and the user ID of the user 2 who uses the sensor 3.
- FIG. 14 is a sequence diagram showing an example of linking processing.
- when the user 2 taps the icon of the emotion analysis application, the processing unit 56 activates the emotion analysis application in response to the tap (step S31).
- when the user 2 inputs the user ID and password, the processing unit 56 transmits the input user ID and password to the server 4, and the processing unit 42 of the server 4 receives the user ID and password transmitted from the smartphone 5 via the communication unit 41 (step S32).
- the processing unit 42 executes authentication processing to determine whether the received user ID and password are registered in the user information 80 (step S33).
- the processing unit 42 transmits authenticated information indicating that the user has been authenticated to the smartphone 5 as the transmission source, and the processing unit 56 of the smartphone 5 receives the information via the communication unit 51 (step S34).
- after authentication, when the user 2 selects the linking process menu on the application, the registration unit 56E causes the touch panel 53 to display a QR code reading screen for reading the QR code 62 printed on the sensor 3 (step S35).
- FIG. 15 is a diagram showing an example of a QR code (registered trademark) reading screen.
- An explanation 121 and an icon 122 for reading the QR code 62 are displayed on the QR code reading screen 120 .
- when the user 2 taps the icon 122, the registration unit 56E activates the camera 54.
- the registration unit 56E decodes the photographed QR code 62 and acquires the sensor ID of the sensor 3 (step S36).
- after obtaining the sensor ID, the processing unit 56 causes the touch panel 53 to display a user ID input screen for inputting the user ID to be linked to the sensor ID (step S37).
- FIG. 16 is a diagram showing an example of a user ID input screen.
- the user ID input screen 130 includes a user ID input field 131 associated with the sensor ID acquired by the registration unit 56E, and an OK button 132 for confirming the input of the user ID.
- the registration unit 56E pairs the acquired sensor ID with the entered user ID and transmits them as a set to the server 4.
- the processing unit 42 of the server 4 receives the set of sensor ID and user ID via the communication unit 41 (step S39).
- the processing unit 42 additionally registers the sensor ID in the user information 80 based on the received set (step S41).
- FIG. 17 is a diagram showing an example of user information in which a sensor ID is additionally registered.
- a sensor ID column is added to the user information 80 .
- for example, when receiving "S123" and "U001" as a set of the sensor ID and the user ID, the processing unit 42 additionally registers the sensor ID "S123" in association with the user ID "U001" in the user information 80.
- the registration unit 56E stores at least the sensor ID in the storage unit 52, thereby registering it in the smartphone 5 (step S40).
- the sensor ID “S123” is registered in the smart phone 5 .
- by repeatedly executing the processes from step S35 onward, a plurality of sensor IDs and a plurality of user IDs can be associated and registered.
- the sensor ID “S124” is also additionally registered in association with the user ID “U002” of the user information 80 .
- a sensor ID “S124” is also registered in the smartphone 5 .
- FIG. 18 is a sequence diagram showing an example of strain amount measurement processing.
- the power supply unit 31 of the sensor 3 starts supplying power to each unit, and the sensor unit 32 measures the strain amount with the strain gauge 38 (step S41).
- the wireless communication unit 34 of the sensor 3 transmits set data including the measured strain amount and the sensor ID of the sensor 3 stored in the storage unit 39 to the smartphone 5, and the strain amount receiving unit 56B of the smartphone 5 receives the set data (step S42).
- the strain amount receiving unit 56B determines whether or not the sensor ID included in the received set data has been registered in the storage unit 52; here it is confirmed that it has been registered (step S43).
- the strain amount receiving unit 56B stores the set data including the registered sensor ID in the storage unit 52 in association with the reception time of the set data (step S44).
- based on the set data and the reception times stored in the storage unit 52, the estimation unit 56C estimates, for each sensor ID, the contact state of the sensor 3 with the body surface of the user 2 from the strain amounts and reception times associated with that sensor ID. Here, it is assumed that the contact condition indicating that the sensor 3 is in contact with the body surface is satisfied (step S45).
- FIG. 19 is a diagram showing an example of temporal changes in strain amount.
- the horizontal axis indicates time (seconds), and the vertical axis indicates the strain amount.
- the estimating unit 56C determines that the above contact condition is satisfied when the state in which the strain amount is equal to or greater than a predetermined threshold TH1 continues beyond a predetermined time T1, counting from the time when the strain amount first becomes equal to or greater than the threshold TH1 (measurement time t1) until measurement time t2.
- alternatively, the estimating unit 56C may determine that the contact condition is satisfied at the time when the strain amount becomes equal to or greater than the threshold TH1 (measurement time t1).
- the providing unit 56D adds the position information detected by the position detecting unit 55 to the set data of the sensor ID and strain amount received in step S42, and transmits the combined data to the server 4.
- the processing unit 42 of the server 4 receives the grouped data to which the position information is added via the communication unit 41 (step S46).
- the processing unit 42 adds the current time measured by the clock unit 44 to the received set data and saves it in the storage unit 43 (step S47). That is, set data including the sensor ID, strain amount, position information, and time information is registered in the storage unit 43.
- next, steps S48 to S52 are executed. That is, similarly to steps S41 to S44, set data including the strain amount measured by the sensor 3 is stored in the storage unit 52 of the smartphone 5 (steps S48 to S51).
- based on the set data and the reception times stored in the storage unit 52, the estimation unit 56C estimates, for each sensor ID, the contact state of the sensor 3 with the body surface of the user 2 from the strain amounts and reception times associated with that sensor ID. Here, it is assumed that the non-contact condition indicating that the sensor 3 is not in contact with the body surface is satisfied (step S52).
- FIG. 20 is a diagram showing an example of temporal changes in the amount of strain.
- the horizontal axis indicates time, and the vertical axis indicates strain amount.
- the estimating unit 56C determines that the non-contact condition is satisfied when the state in which the strain amount is less than a predetermined threshold TH2 continues beyond a predetermined time T2, counting from the time when the strain amount first becomes less than the threshold TH2 (measurement time t3) until measurement time t4.
- alternatively, the estimation unit 56C may determine that the non-contact condition is satisfied at the time when the strain amount becomes less than the threshold TH2 (measurement time t3).
- the threshold TH2 may be the same as or different from the threshold TH1.
- the predetermined time T2 may be the same as or different from the predetermined time T1.
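The contact and non-contact conditions described above can be sketched as a single pass over timestamped strain samples; the threshold and duration values and the sample data are illustrative assumptions:

```python
def estimate_contact(samples, th1, t1, th2, t2):
    """Walk through (time, strain) samples and report contact transitions.

    Contact: strain >= th1 continuously for at least t1 seconds.
    Non-contact: strain < th2 continuously for at least t2 seconds.
    Returns a list of (time, state) transition events.
    """
    events = []
    in_contact = False
    run_start = None  # start time of the current qualifying run
    for t, strain in samples:
        if not in_contact:
            if strain >= th1:
                run_start = t if run_start is None else run_start
                if t - run_start >= t1:
                    in_contact = True
                    events.append((t, "contact"))
                    run_start = None
            else:
                run_start = None
        else:
            if strain < th2:
                run_start = t if run_start is None else run_start
                if t - run_start >= t2:
                    in_contact = False
                    events.append((t, "non-contact"))
                    run_start = None
            else:
                run_start = None
    return events

# Invented samples: sticker attached around t=1, peeled off around t=5
samples = [(0, 10), (1, 120), (2, 130), (3, 125), (4, 140),
           (5, 15), (6, 12), (7, 8), (8, 5)]
print(estimate_contact(samples, th1=100, t1=2, th2=50, t2=2))
# [(3, 'contact'), (7, 'non-contact')]
```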
- the providing unit 56D does not transmit the set data of the sensor ID and strain amount received in step S49 to the server 4.
- the smartphone 5 receives the set data of the sensor ID and the strain amount measured by the sensor 3 from the sensor 3 (steps S53 to S54).
- the strain amount receiving unit 56B determines whether or not the sensor ID included in the received set data has been registered in the storage unit 52; here it is confirmed that it has not been registered (step S55).
- the strain amount receiving unit 56B discards the group data containing the unregistered sensor ID (step S56). Accordingly, the providing unit 56D does not transmit group data including unregistered sensor IDs to the server 4 .
- the server 4 can collect strain amounts and analyze the emotions of the user 2 without giving the user 2 a feeling of discomfort or resistance.
- the estimating unit 56C estimates that the sensor 3 is in contact with the body surface when the strain amount is equal to or greater than the threshold TH1, and the providing unit 56D can start providing the strain amount to the server 4 based on that estimate. This prevents the strain amount from being provided to the server 4 before the sensor 3 makes contact, so the server 4 can accurately analyze emotions.
- the estimating unit 56C can estimate that the sensor 3 is in contact with the body surface when the state in which the strain amount is equal to or greater than the threshold TH1 continues for the predetermined time T1.
- the estimation unit 56C estimates that the sensor 3 is not in contact with the body surface when the strain amount is less than the threshold TH2, and the providing unit 56D can end the provision of the strain amount to the server 4 based on that estimate. This prevents the strain amount from being provided to the server 4 while the sensor 3 is not in contact with the body surface, so the server 4 can accurately analyze emotions.
- even when the sensor 3 is in contact with the body surface, the physical quantity may temporarily decrease if the user temporarily becomes expressionless; however, the body surface vibrates finely, so a certain amount of strain occurs even with a neutral expression, and a state in which the strain amount is small does not last. Therefore, it can be estimated that the sensor 3 is not in contact with the body surface when, for example, the state in which the strain amount is less than the threshold TH2 continues for the predetermined time T2.
- the registration unit 56E of the smartphone 5 registers the sensor ID of the sensor 3 used by the user 2 or his or her companion.
- the strain amount receiving unit 56B receives set data of strain amounts and sensor IDs from the plurality of sensors 3 .
- the providing unit 56D provides the server 4 with grouped data including the sensor ID registered by the registering unit 56E, among the grouped data received by the strain amount receiving unit 56B. Therefore, the strain amounts of the user 2 and the companion can be efficiently provided to the server 4 .
- the providing unit 56D does not provide the server 4 with grouped data including sensor IDs not registered in the smartphone 5, among the grouped data received by the strain amount receiving unit 56B.
- since the smartphone 5 in which a given sensor ID is registered is unique, the set data of that sensor ID and strain amount is provided to the server 4 from only one smartphone 5. Therefore, it is possible to prevent duplicate strain amounts from being provided to the server 4.
- User information 80 including a user ID and a sensor ID is also registered in the server 4. Therefore, the server 4 can identify the user 2 from the set data of the strain amount and the sensor ID provided from the smartphone 5 and can analyze the emotion of each user 2. Note that the user ID is not included in the set data provided from the smartphone 5 to the server 4. Therefore, even if a third party intercepts the set data, the third party cannot identify the user 2 whose strain amount was measured. Thus, the privacy of the user 2 is protected.
- the provision unit 56D of the smartphone 5 transmits the set data including the measurement position information of the strain amount to the server 4. Therefore, the server 4 can analyze emotions for each measurement position.
- FIG. 21 is a block diagram showing an example of the configuration of the smartphone 5 according to the modification.
- the configuration of the smartphone 5 is the same as the configuration of the smartphone 5 according to the embodiment shown in FIG.
- the processing unit 56 further includes a setting unit 56F as a functional processing unit realized by executing a computer program stored in the storage unit 52 in advance.
- the setting unit 56F sets whether or not the user 2 allows the providing unit 56D to provide the strain amount to the server 4 for each sensor ID. For example, it is assumed that a button for setting permission or non-permission to provide strain amounts for each sensor ID is provided in the menu of the emotion analysis application. The user 2 taps the button to set permission or disapproval of the provision of the amount of strain, and the setting unit 56F notifies the provision unit 56D of the set result of permission or disapproval.
- When the user 2 has set non-permission of strain amount provision, the providing unit 56D suspends provision of the paired data including the strain amount to the server 4, even while the sensor 3 is in contact with the body surface of the user 2.
- When the user 2 sets permission again, the providing unit 56D resumes provision of the paired data including the strain amount to the server 4.
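As a rough illustration of this per-sensor permission gate, the following sketch suspends and resumes provision based on a per-sensor-ID flag. All class, method, and parameter names are assumptions for illustration; the patent does not disclose an implementation.

```python
class ProvidingUnit:
    """Sketch of a providing unit (56D) gated by a setting unit's (56F) per-sensor permissions.

    Hypothetical names; illustrates the behavior described in the text, not the disclosed design.
    """

    def __init__(self, upload_to_server):
        self.upload_to_server = upload_to_server  # callback that would send paired data to server 4
        self.permitted = {}  # sensor_id -> bool, maintained by the setting unit

    def set_permission(self, sensor_id, allowed):
        # Called when the user taps the permit/refuse button for a sensor ID.
        self.permitted[sensor_id] = allowed

    def provide(self, sensor_id, strain_amount, in_contact):
        # Provide only while the sensor contacts the body surface AND the user permits it.
        if in_contact and self.permitted.get(sensor_id, False):
            self.upload_to_server((sensor_id, strain_amount))
            return True
        return False  # provision suspended


sent = []
unit = ProvidingUnit(sent.append)
unit.set_permission("S-001", True)
unit.provide("S-001", 0.42, in_contact=True)   # provided
unit.set_permission("S-001", False)
unit.provide("S-001", 0.43, in_contact=True)   # suspended despite contact
```

When permission is set again, the next `provide` call with contact resumes uploads, matching the suspend/resume behavior described above.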
- Although the strain amount measured by the sensor 3 is transmitted to the server 4 via the smartphone 5 in the above-described embodiment, it may be transmitted to the server 4 without going through the smartphone 5.
- For example, an access point is provided for each area of the venue; the sensor 3 communicates with the access point and transmits the paired data of the strain amount and the sensor ID to the access point.
- The access point transmits the paired data received from the sensor 3 to the server 4 via the network 6.
- User information 80 including sensor IDs and user IDs is registered in the server 4, so the server 4 can analyze each user's emotion based on paired data whose sensor ID is registered in the user information 80. If, on the other hand, the sensor ID included in received paired data is not registered in the user information 80, the paired data cannot be associated with a user ID. In that case, the server 4 may discard the paired data, or may analyze emotion for each sensor 3 based on the paired data even though the user cannot be identified.
- Since the server 4 can know which access point each item of paired data passed through, it can analyze the emotions of the users 2 for each access point (that is, for each area where an access point is installed).
- In the above-described embodiment, the smartphone 5 does not transmit time information to the server 4. However, when the paired data received from the sensor 3 are transmitted to the server 4 in a batch, the time at which each item of paired data was received may be transmitted to the server 4 together with the paired data.
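The server-side handling described here can be sketched as follows. The names and data shapes are assumptions, not the disclosed implementation: pairs whose sensor ID appears in the registered user information are routed to per-user analysis, while unregistered pairs are kept for anonymous per-sensor analysis (discarding them would be the other option the text mentions).

```python
def route_paired_data(pairs, user_info):
    """Split (sensor_id, strain) pairs into per-user and per-sensor buckets.

    user_info maps a registered sensor ID to a user ID (user information 80).
    Unregistered pairs cannot be associated with a user; here we keep them
    for anonymous per-sensor analysis instead of discarding them.
    """
    per_user, per_sensor_only = {}, {}
    for sensor_id, strain in pairs:
        if sensor_id in user_info:
            per_user.setdefault(user_info[sensor_id], []).append(strain)
        else:
            per_sensor_only.setdefault(sensor_id, []).append(strain)
    return per_user, per_sensor_only


user_info = {"S-001": "U-100"}
pairs = [("S-001", 0.1), ("S-999", 0.5), ("S-001", 0.2)]
per_user, anonymous = route_paired_data(pairs, user_info)
```

Here `per_user` groups strains under user ID "U-100", while the pair from the unregistered sensor "S-999" lands in `anonymous`.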
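The batched transmission with receive times could look like the following sketch. All names (`Batcher`, `on_receive`, `flush`) are hypothetical; the point is only that each pair is stamped with its receive time before the batch is sent.

```python
import time


class Batcher:
    """Collect paired data with receive times and flush them to the server in one batch."""

    def __init__(self, send_batch):
        self.send_batch = send_batch  # callback standing in for one upload to server 4
        self.buffer = []

    def on_receive(self, sensor_id, strain, now=None):
        # Record the time at which each pair was received from the sensor.
        self.buffer.append((sensor_id, strain, now if now is not None else time.time()))

    def flush(self):
        # Send everything buffered so far as a single batch.
        batch, self.buffer = self.buffer, []
        if batch:
            self.send_batch(batch)
        return len(batch)
```

A flush after two receives sends one batch of two timestamped pairs; flushing again with an empty buffer sends nothing.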
- Part or all of the components constituting each of the above devices may be implemented as one or more semiconductor devices such as system LSIs.
- A system LSI is an ultra-multifunctional LSI manufactured by integrating multiple components on a single chip. Specifically, it is a computer system that includes a microprocessor, a ROM, a RAM, and so on. A computer program is stored in the RAM, and the system LSI achieves its functions by the microprocessor operating according to that computer program.
- The computer program described above may be recorded on a non-transitory computer-readable recording medium such as an HDD, a CD-ROM, or a semiconductor memory, and distributed. The computer program may also be transmitted and distributed via an electric communication line, a wireless or wired communication line, a network such as the Internet, or data broadcasting.
- The server 4 may be realized by a plurality of processors or a plurality of computers.
- Part or all of the functions of the server 4 may be provided by cloud computing; that is, part or all of the functions of the server 4 may be realized by a cloud server.
- The sensor unit 32 may be a myoelectric sensor.
- A myoelectric sensor is a sensor that measures a weak electric-field change (potential difference) generated in a muscle.
- The myoelectric sensor outputs a current corresponding to the potential difference.
- The current value output from the myoelectric sensor is an example of a physical quantity that indicates the degree of change in the shape of the surface (the face of the user 2) that the adhesive surface of the sensor unit 32 contacts.
- 1 emotion analysis system (providing system), 2 user, 3 sensor, 3A adhesive layer, 3B circuit layer, 3C print layer, 3D release sheet, 4 server, 5 smartphone, 6 network, 7 wireless base station, 31 power supply unit, 32 sensor unit, 34 wireless communication unit, 38 strain gauge, 39 storage unit, 41 communication unit, 42 processing unit, 43 storage unit, 44 timing unit, 51 communication unit, 52 storage unit, 53 touch panel, 54 camera, 55 position detection unit, 56 processing unit, 56A presentation unit, 56B strain amount reception unit (physical quantity reception unit), 56C estimation unit, 56D providing unit, 56E registration unit, 56F setting unit, 57 internal bus, 61 terms of use, 62 QR code, 72 OK button, 73 cancel button, 80 user information, 90 event information screen, 92 OK button, 93 cancel button, 100 sticker information screen, 102A sticker, 102B sticker, 102C sticker, 103A number input field, 103B number input field, 103C number input field, 104 OK button, 105 cancel
Abstract
Description
According to the system described in Non-Patent Literature 1, data such as video related to a user's emotions and movements may be acquired and analyzed without the user's consent. Some users may therefore feel uncomfortable, and for some users this may provoke resistance, such as a sense of refusal or fear at having their emotions read.
According to the present disclosure, emotions can be analyzed without causing the user a sense of unnaturalness or resistance.
First, an overview of embodiments of the present disclosure is listed and described.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. Each of the embodiments described below shows one specific example of the present disclosure. The numerical values, shapes, materials, components, arrangement positions and connection forms of the components, steps, and order of steps shown in the following embodiments are examples and do not limit the present disclosure. Among the components in the following embodiments, components not recited in the independent claims can be added arbitrarily. Each drawing is a schematic diagram and is not necessarily drawn precisely.
FIG. 1 is a diagram showing an example of the overall configuration of the emotion analysis system according to the embodiment.
FIG. 2 is a block diagram showing an example of the configurations of the sensor 3 and the server 4 constituting the emotion analysis system 1 according to the embodiment.
FIG. 6 is a block diagram showing an example of the configuration of the smartphone 5 according to the embodiment.
Each process executed by the emotion analysis system 1 will be described below.
After the user registration process is completed, the user 2 executes a registration process for an event in which the user 2 and his or her companions will participate.
After receiving the sensor 3, the user 2 executes a linking process for linking the sensor ID of the sensor 3 with the user ID of the user 2 who uses the sensor 3.
After the sensor ID and the user ID are linked, the user 2 can start using the sensor 3.
As described above, according to the embodiment of the present disclosure, the user 2 attaches the sensor 3 to a body surface such as the face only after consenting that, by attaching the sensor 3 to the body surface, strain amounts measured on the user 2 are provided to the server 4. This prevents the strain amounts of the user 2 from being provided to the server 4 without the user's knowledge. In addition, since the sensor 3 is sheet-shaped, the user 2 can wear the sensor 3 as if attaching a sticker to the face. In particular, at venues such as sports-viewing venues and attractions, users 2 feel little resistance to attaching a sticker to the face. The server 4 can therefore collect strain amounts and analyze the emotions of the users 2 without causing them a sense of unnaturalness or resistance.
FIG. 21 is a block diagram showing an example of the configuration of the smartphone 5 according to the modification.
In the above-described embodiment, the strain amount measured by the sensor 3 is transmitted to the server 4 via the smartphone 5, but it may be transmitted to the server 4 without going through the smartphone 5. For example, an access point is provided for each area of the venue; the sensor 3 communicates with the access point and transmits the paired data of the strain amount and the sensor ID to the access point. The access point transmits the paired data received from the sensor 3 to the server 4 via the network 6.
In the above-described embodiment, the smartphone 5 does not transmit time information to the server 4. However, when the paired data received from the sensor 3 are transmitted to the server 4 in a batch, the time at which each item of paired data was received may be transmitted to the server 4 together with the paired data.
Claims (12)
- A providing system comprising: a presentation unit that presents, to a user, that a sensor being in contact with a body surface of the user, the sensor having a contact surface and measuring a physical quantity indicating a degree of change of a surface contacted by the contact surface, is a condition for permission to provide the physical quantity to a party other than the user; an estimation unit that estimates a contact state of the sensor with respect to the body surface based on the physical quantity; and a providing unit that starts provision of the physical quantity to the other party based on a result of estimation of the contact state by the estimation unit.
- The providing system according to claim 1, wherein the estimation unit estimates that the sensor is in contact with the body surface based on a result of comparison between the physical quantity and a first threshold, and the providing unit starts provision of the physical quantity to the other party when the estimation unit estimates that the sensor has come into contact with the body surface.
- The providing system according to claim 2, wherein the estimation unit estimates that the sensor is in contact with the body surface based on a duration of the result of comparison between the physical quantity and the first threshold.
- The providing system according to any one of claims 1 to 3, wherein the estimation unit estimates that the sensor is not in contact with the body surface based on a result of comparison between the physical quantity and a second threshold, and the providing unit ends provision of the physical quantity to the other party when the estimation unit estimates that the sensor is not in contact with the body surface.
- The providing system according to claim 4, wherein the estimation unit estimates that the sensor is not in contact with the body surface based on a duration of the result of comparison between the physical quantity and the second threshold.
- The providing system according to any one of claims 1 to 5, further comprising: a registration unit that registers a sensor identifier identifying the sensor; and a physical quantity reception unit that receives, from the sensor, a pair of the physical quantity and the sensor identifier of the sensor, wherein the providing unit provides, to the other party, those pairs received by the physical quantity reception unit that include the sensor identifier registered by the registration unit.
- The providing system according to claim 6, wherein the providing unit further does not transmit, to a device used by the other party, pairs received by the physical quantity reception unit that do not include the sensor identifier registered by the registration unit.
- The providing system according to claim 6 or claim 7, wherein the registration unit further provides a user identifier identifying the user and the sensor identifier to the device used by the other party.
- The providing system according to any one of claims 1 to 8, wherein the providing unit further provides measurement position information of the physical quantity to the device used by the other party.
- The providing system according to any one of claims 1 to 9, further comprising a setting unit that sets whether the user permits provision of the physical quantity, wherein the providing unit suspends provision of the physical quantity to the other party when provision of the physical quantity is not permitted by the user.
- A providing method comprising: presenting, to a user, that a sensor being in contact with a body surface of the user, the sensor having a contact surface and measuring a physical quantity indicating a degree of change of a surface contacted by the contact surface, is a condition for permission to provide the physical quantity to a party other than the user; estimating a contact state of the sensor with respect to the body surface based on the physical quantity; and starting provision of the physical quantity to the other party based on a result of the estimation of the contact state.
- A computer program for causing a computer to function as: a presentation unit that presents, to a user, that a sensor being in contact with a body surface of the user, the sensor having a contact surface and measuring a physical quantity indicating a degree of change of a surface contacted by the contact surface, is a condition for permission to provide the physical quantity to a party other than the user; an estimation unit that estimates a contact state of the sensor with respect to the body surface based on the physical quantity; and a providing unit that starts provision of the physical quantity to the other party based on a result of estimation of the contact state by the estimation unit.
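The contact/non-contact estimation from threshold comparisons sustained for some duration, described in claims 2 to 5, can be sketched as hysteresis-with-dwell-time logic. The thresholds, the sample-count notion of "duration", and all names below are illustrative assumptions, not the claimed implementation.

```python
class ContactEstimator:
    """Estimate sensor contact from a stream of physical-quantity samples.

    Contact is declared when samples stay above `on_threshold` for `dwell`
    consecutive samples; non-contact when they stay below `off_threshold`
    for `dwell` consecutive samples (a claims 2-5 style hysteresis sketch).
    """

    def __init__(self, on_threshold=0.3, off_threshold=0.1, dwell=3):
        self.on_threshold = on_threshold    # stands in for the first threshold
        self.off_threshold = off_threshold  # stands in for the second threshold
        self.dwell = dwell                  # required duration, in samples
        self.in_contact = False
        self._run = 0  # length of the current run of agreeing comparisons

    def update(self, value):
        # Advance the run counter while the comparison result persists; reset otherwise.
        if not self.in_contact:
            self._run = self._run + 1 if value > self.on_threshold else 0
            if self._run >= self.dwell:
                self.in_contact, self._run = True, 0
        else:
            self._run = self._run + 1 if value < self.off_threshold else 0
            if self._run >= self.dwell:
                self.in_contact, self._run = False, 0
        return self.in_contact
```

Feeding strain samples into `update`, provision would start on the False-to-True transition and end on the True-to-False transition, which is where the providing unit of claims 2 and 4 starts and ends provision.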
Priority Applications (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023550854A (JPWO2023053278A1) | 2021-09-29 | 2021-09-29 | |
| PCT/JP2021/035898 (WO2023053278A1) | 2021-09-29 | 2021-09-29 | Providing system, providing method, and computer program |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2023053278A1 (ja) | 2023-04-06 |
Family

ID=85781535

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/035898 | WO2023053278A1 (ja) | 2021-09-29 | 2021-09-29 |

Country Status (2)

| Country | Link |
|---|---|
| JP (1) | JPWO2023053278A1 (ja) |
| WO (1) | WO2023053278A1 (ja) |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014027373A1 | 2012-08-13 | 2014-02-20 | Terumo Corporation | Body moisture meter, method for controlling body moisture meter, and storage medium |
| JP2014535114A | 2011-11-09 | 2014-12-25 | Koninklijke Philips N.V. | Use of biosensors for emotion sharing via a data network service |
| WO2019073661A1 | 2017-10-13 | 2019-04-18 | Sony Corporation | Information processing device, information processing method, information processing system, display device, and reservation system |
| JP2021129891A | 2020-02-21 | 2021-09-09 | Sumitomo Electric Industries, Ltd. | Terminal device, emotion analysis device, emotion analysis system, and emotion analysis method |
Also Published As

| Publication Number | Publication Date |
|---|---|
| JPWO2023053278A1 (ja) | 2023-04-06 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21959323; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2023550854; Country of ref document: JP |
| | WWE | Wipo information: entry into national phase | Ref document number: 18695842; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21959323; Country of ref document: EP; Kind code of ref document: A1 |