WO2016143415A1 - Information processing apparatus, information processing method, and program - Google Patents

Information processing apparatus, information processing method, and program

Info

Publication number
WO2016143415A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
information processing
stimulus
processing apparatus
Application number
PCT/JP2016/053079
Other languages
English (en)
Japanese (ja)
Inventor
雄 田中
Original Assignee
ソニー株式会社
Application filed by ソニー株式会社
Publication of WO2016143415A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems

Definitions

  • This disclosure relates to an information processing apparatus, an information processing method, and a program.
  • In recent years, various techniques for applying an appropriate physical stimulus according to the user's condition have been proposed. For example, Patent Document 1 describes a technology intended to awaken a driver before the driver feels sleepy by changing the playback mode of content based on the driving state of the car and the playback information of the content data being played back.
  • In a technology such as that of Patent Document 1, it is conceivable to acquire, for example, a user's biological information using a sensor and to determine whether to give a physical stimulus to the user based on that information.
  • In such a case, a determiner that decides whether or not a physical stimulus should be applied based on information indicating the user's state is used, and an appropriate determiner must be prepared in advance.
  • The present disclosure therefore proposes a new and improved information processing apparatus, information processing method, and program that make it possible to more easily determine whether to give a physical stimulus to a user based on information indicating the user's state.
  • According to the present disclosure, an information processing apparatus is provided that includes a determination unit that determines whether or not to give a physical stimulus to a user based on first data indicating the user's state, the determination being executed using a determiner generated based on second data that includes results of determining whether or not to give the stimulus based on moving averages of the first data in at least two time windows having different lengths.
  • Further, according to the present disclosure, an information processing method is provided that includes a processor executing at least a determination as to whether or not to give a physical stimulus to a user based on first data indicating the user's state, the determination being executed using a determiner generated based on second data that includes results of determining whether or not to give the stimulus based on moving averages of the first data in at least two time windows having different lengths.
  • Further, according to the present disclosure, a program is provided that causes a computer to realize a function of executing at least a determination as to whether or not to give a physical stimulus to a user based on first data indicating the user's state, the determination being executed using a determiner generated based on second data that includes results of determining whether or not to give the stimulus based on moving averages of the first data in at least two time windows having different lengths.
  • FIG. 1 is a diagram illustrating a system configuration according to an embodiment of the present disclosure and the functional configuration of a wearable terminal included in the system. FIG. 2 is a flowchart illustrating an example of processing of a wearable terminal according to an embodiment of the present disclosure. FIG. 3 is a flowchart illustrating an example of the output control process in the example of FIG. 2. FIG. 4 is a flowchart illustrating an example of the initial output level determination process in the example of FIG. 3.
  • FIG. 5 is a diagram illustrating a first example in which an initial output level is determined based on a user profile in the process illustrated in FIG. 4.
  • FIG. 6 is a diagram showing a second example in which the initial output level is determined based on the user profile in the process shown in FIG. 4.
  • FIG. 7 is a diagram showing an example in which the initial output level is determined based on an action recognition result in the process shown in FIG. 4.
  • FIG. 8 is a diagram showing an example of determining the initial output level based on the time zone in the process shown in FIG. 4. FIG. 9 is a flowchart showing an example of the output level update process in the example of FIG. 3. FIG. 10 is a diagram showing an example of updating the output level based on the short-term moving average of the sensor data in the process shown in FIG. 9. FIG. 11 is a diagram showing the configuration of a system according to an embodiment of the present disclosure and the functional configuration of the server included in the system. FIG. 12 is a flowchart illustrating an example of processing of a server according to an embodiment of the present disclosure.
  • FIG. 13 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 1 is a diagram illustrating a system configuration according to an embodiment of the present disclosure and a functional configuration of a wearable terminal included in the system.
  • the system 10 includes a wearable terminal 100, a mobile terminal 200, and a server 300.
  • the wearable terminal 100 is, for example, eyewear, wristwear, neckwear, or the like, and is worn on the user's body.
  • the mobile terminal 200 is a smartphone or a tablet, for example, and is carried by the user.
  • the system 10 does not necessarily include the mobile terminal 200.
  • the server 300 is realized by one or a plurality of server devices on the network, and provides a service to the wearable terminal 100 and / or the mobile terminal 200.
  • one or a plurality of server devices constituting the wearable terminal 100, the mobile terminal 200, and the server 300 can be realized by a hardware configuration of an information processing device described later.
  • the wearable terminal 100 includes an output device 110, a processor 120, a sensor 130, an input device 140, a storage 150, and a communication device 160.
  • the output device 110 includes at least one device that provides physical stimulation to the user. More specifically, such a device includes, for example, a speaker that gives a stimulus by sound, a vibrator that gives a stimulus by vibration, and / or a lamp that gives a stimulus by light.
  • the output device 110 may further include a device that outputs information to the user. More specifically, such a device may include, for example, a display, a speaker, and / or a vibrator, and may be the same as the device that provides the physical stimulus described above.
  • the processor 120 is implemented as a CPU or the like, and realizes various functions by operating according to programs and data stored in a memory or storage.
  • the functions realized by the processor 120 include a control unit 122, a first determination unit 124, and a second determination unit 126.
  • the function realized by the processor 120 may further include an action recognition unit 128. Details of these functions will be described later.
  • the processor 120 controls the output device 110 based on the result of processing by these functions.
  • the processor 120 acquires an input for processing by the above function from the sensor 130 and the input device 140.
  • the processor 120 stores data related to processing in the storage 150 and reads out from the storage 150.
  • the processor 120 exchanges data regarding processing with the mobile terminal 200 via the communication device 160.
  • Sensor 130 includes at least one sensor that detects the state of the user wearing wearable terminal 100.
  • The sensor may be attached to the user. More specifically, such sensors include, for example, a biometric sensor that measures biological indices such as the user's pulse, respiration, blood pressure, sweating, or blood flow; a sound sensor and a light sensor that detect sound and light generated in the vicinity of the user wearing the wearable terminal 100; and/or an acceleration sensor that detects the acceleration applied to the wearable terminal 100 by the user's movement.
  • sensor data acquired from the sensor 130 is an example of data indicating a user's state.
  • the input device 140 includes at least one device that accepts user input. More specifically, such an apparatus includes, for example, a button and a touch panel that accept operation input, a microphone that accepts voice input, and / or a camera that accepts gesture input.
  • the communication device 160 exchanges various data related to processing in the processor 120 with the mobile terminal 200.
  • the communication device 160 communicates with the mobile terminal 200 by wireless communication such as Bluetooth (registered trademark) or Wi-Fi.
  • Data transmitted and received by the communication device 160 may be processed in the mobile terminal 200, or may be transferred from the mobile terminal 200 to the server 300 via the network and processed in the server 300.
  • FIG. 2 is a flowchart illustrating an example of processing of the wearable terminal according to an embodiment of the present disclosure.
  • the illustrated process is executed by the processor 120 in the wearable terminal 100 described above.
  • the processor 120 acquires sensor data from the sensor 130 (S101).
  • the sensor data is a detection result of the state of the user wearing the wearable terminal 100 by the sensor 130, and includes, for example, biological indicators such as the user's pulse, respiration, blood pressure, sweating, and blood flow.
  • the processor 120 determines whether or not the first determination unit 124 can use the determiner (S103).
  • The determiner is used to determine, based on the sensor data, whether or not a physical stimulus should be given by the output device 110. The determiner is generated, for example, by machine learning based on the results obtained when stimuli were given based on sensor data in the past (whether or not there was a response from the user).
  • In the present embodiment, the determiner is generated by the server 300 based on stimulation trial results collected from a plurality of wearable terminals 100. Therefore, for example, after the start of the service, the determiner is not necessarily available until sufficient trial results have been collected; in S103 described above, it is thus determined whether or not the determiner is available.
  • Note that, as will be described below, when the determiner is not available, the moving average of the sensor data is calculated and used for the determination instead. The determiner used by the first determination unit 124, on the other hand, does not necessarily have to depend on the moving average of the sensor data.
  • When the determiner is available (YES in S103), the first determination unit 124 performs the determination using the determiner (S105).
  • When the determiner is not available (NO in S103), the second determination unit 126 performs the determination using moving averages. More specifically, the second determination unit 126 calculates a short-term moving average and a long-term moving average of the sensor data acquired in S101 (S107), and then determines whether or not to give a physical stimulus to the user based on the calculated moving averages (S109).
  • Here, the long-term moving average and the short-term moving average are two moving averages in time windows having different lengths: if the short-term moving average is a first moving average of the sensor data in a first time window, the long-term moving average is a second moving average of the sensor data in a second time window that is longer than the first time window.
  • The long-term and short-term moving averages are defined only by the relative lengths of their time windows; the specific length of each time window is not particularly limited. Nor is the determination based on moving averages in the present embodiment limited to two moving averages; for example, three or more moving averages with different time window lengths may be used.
  • For example, the second determination unit 126 determines that a physical stimulus should be given to the user when the relationship between the short-term moving average and the long-term moving average of the sensor data indicates a decrease in the activity of the sympathetic nervous system. More specifically, the second determination unit 126 may treat the sensor data so that its value increases as the activity of the user's sympathetic nervous system increases, and determine that a physical stimulus should be given to the user when the short-term moving average of the sensor data falls below the long-term moving average.
  • For sensor data whose value decreases as the activity of the sympathetic nervous system increases, such as the blood flow of peripheral blood vessels, the second determination unit 126 may reverse the sign of the data and apply the same determination as above (determining that a stimulus should be given when the short-term moving average falls below the long-term moving average), or may instead reverse the determination itself and determine that a stimulus should be given when the short-term moving average exceeds the long-term moving average.
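The determination of S107 and S109 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the window lengths and the example data are assumptions, and the sensor value is assumed to increase with sympathetic-nervous-system activity.

```python
def moving_average(values, window):
    """Mean of the most recent `window` samples."""
    recent = values[-window:]
    return sum(recent) / len(recent)

def should_stimulate(sensor_values, short_window=5, long_window=20):
    """Return True when the short-term moving average falls below the
    long-term moving average, i.e. when the sensor value (assumed to
    grow with sympathetic-nervous-system activity) is declining."""
    if len(sensor_values) < long_window:
        return False  # not enough data to fill the long-term window
    short_ma = moving_average(sensor_values, short_window)
    long_ma = moving_average(sensor_values, long_window)
    return short_ma < long_ma

# Sensor values that hold steady and then decline (e.g. falling alertness):
data = [10.0] * 20 + [9.0, 8.0, 7.0, 6.0, 5.0]
print(should_stimulate(data))  # True: short-term average is below long-term
```

With three or more windows, as the description allows, the same comparison would simply be applied pairwise across the windows.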
  • the control unit 122 controls the output device 110 (S111). Details of the processing in S111 will be described later with reference to FIG. Thereafter, the processor 120 transmits data to the server 300 via the communication device 160 (S113).
  • The data to be transmitted may include, for example, the sensor data acquired in S101, the determination result of S105 or S109, the content of the output control in S111 (such as the output duration or the temporal change in the output level), and sensor data or user feedback acquired during or after the output.
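The record transmitted in S113 might be serialized as in the following sketch; every field name here is an illustrative assumption, not a schema from the disclosure.

```python
import json

# Illustrative record assembled after S111; all field names are
# assumptions for the example, not the patent's actual data format.
record = {
    "sensor_data": [72, 71, 69, 68],      # e.g. pulse samples from S101
    "determination": "stimulate",         # result of S105 or S109
    "output_control": {
        "duration_s": 12,                 # how long the output lasted
        "level_history": [40, 50, 60],    # temporal change in output level
    },
    "user_response": True,                # feedback acquired during/after output
}

payload = json.dumps(record)
print(payload)
```

Collecting records of this shape from many terminals is what later allows the server to generate the determiner by machine learning.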
  • FIG. 3 is a flowchart illustrating an example of the output control process (S111) in the example of FIG.
  • the control unit 122 refers to the determination result (determination result in S105 or S109 shown in FIG. 2) by the first determination unit 124 or the second determination unit 126 (S121).
  • the control unit 122 determines whether or not an output for giving a stimulus to the user is instructed based on the determination result referred to in S121 (S123).
  • When an output for giving a stimulus is instructed (YES in S123), the control unit 122 determines the initial output level (S125).
  • The output level includes, for example, the sound pressure level, pitch, and duration of the output sound when sound is output by a speaker included in the output device 110; the amplitude, frequency, duration, and the like of the vibration when vibration is output by a vibrator; and the light intensity, color, flashing pattern, duration, and the like when light is output by a lamp. Details of the processing in S125 will be described later with reference to FIG. 4.
  • The control unit 122 controls the output device 110 according to the initial output level determined in S125 and executes an output for giving a physical stimulus to the user (S127). Thereafter, the control unit 122 acquires sensor data from the sensor 130 or acquires user feedback via the input device 140 (S129).
  • The sensor data or user feedback acquired here is treated as the user's response to the physical stimulus given by the output of S127.
  • the control unit 122 may acquire an explicit user response by the input device 140, or may interpret an increase in the activity of the sympathetic nervous system indicated by the sensor data as an implicit user response.
  • the control unit 122 determines whether or not there is a user response to the output of S127 based on the sensor data or user feedback acquired in S129 (S131).
  • For example, when an increase in the activity of the sympathetic nervous system is indicated by the sensor data, the control unit 122 determines that there is a user response to the output.
  • the control unit 122 determines that there is a user response to the output when some input is acquired by the input device 140.
  • Alternatively, the control unit 122 may output notification information that prompts the user to respond, using a display, a speaker, or the like included in the output device 110, and determine that there is a user response to the output when an input responding to the notification information is acquired.
  • When it is determined that there is a user response to the output (YES in S131), the control unit 122 ends the output control process.
  • On the other hand, when it is determined that there is no user response (NO in S131), the control unit 122 updates the output level (S133), controls the output device 110 according to the updated output level, and executes the output for giving a physical stimulus again (S127). Details of the processing in S133 will be described later with reference to FIG. 9; in many cases, the absence of a response suggests that the stimulus was not effective, so the output level is increased by the update.
  • the output control by the control unit 122 is continued until it is determined that there is a user response to the output.
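The loop of S125 through S133 can be sketched as follows, with hypothetical callables standing in for the output device and the response check (both names, and the step size, are assumptions for the example):

```python
MAX_LEVEL = 100

def run_output_control(initial_level, emit_output, got_response, step=10):
    """Repeat the stimulus output, raising the level until the user
    responds (S127 -> S129/S131 -> S133 -> S127 ...). `emit_output`
    and `got_response` are hypothetical callables standing in for
    output-device control and sensor/input feedback."""
    level = initial_level
    while True:
        emit_output(level)                    # S127: give the physical stimulus
        if got_response(level):               # S129/S131: sensor data or input
            return level                      # response obtained: end control
        level = min(level + step, MAX_LEVEL)  # S133: raise the output level

# Example: suppose the user only notices the stimulus at level >= 60.
levels = []
final = run_output_control(40, levels.append, lambda lv: lv >= 60)
print(levels, final)  # [40, 50, 60] 60
```

A fixed step is used here for simplicity; as described below with reference to FIGS. 9 and 10, the update may instead depend on the determination result or on the sensor data.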
  • FIG. 4 is a flowchart showing an example of the initial output level determination process (S125) in the example of FIG.
  • the control unit 122 refers to the determination result (determination result in S105 or S109 shown in FIG. 2) by the first determination unit 124 or the second determination unit 126 (S141).
  • The control unit 122 determines whether or not the initial output level is specified in the determination result referred to in S141 (S143). Since the determination of S123 in FIG. 3 has already been passed as a premise, the determination result at least instructs an output for giving a stimulus to the user.
  • The initial output level is specified in the determination result, for example, when the determination using the determiner is executed in S105 shown in FIG. 2.
  • The determiner used here is generated, for example, by machine learning based on the results obtained when stimuli were given based on sensor data in the past. If data such as the output duration and the temporal change in the output level, and not just whether or not a stimulus was given, are used in the machine learning that generates the determiner, it may be possible to generate a determiner that outputs, for an input of sensor data, not only whether an output for stimulation should be executed but also at what level it should be executed.
  • Alternatively, when the determination using the moving averages is executed in S109 shown in FIG. 2, the initial output level may be specified in the determination result according to the difference between the short-term moving average and the long-term moving average (for example, when the difference is large, the activity of the sympathetic nervous system may be decreasing sharply, so a high initial output level may be specified).
  • When the initial output level is specified in the determination result (YES in S143), the control unit 122 sets the specified initial output level (S145). On the other hand, when the initial output level is not specified (NO), the control unit 122 could simply set a predetermined initial output level; in the illustrated example, however, the control unit 122 acquires a user profile from the storage 150 (or from the mobile terminal 200 or the server 300), acquires a behavior recognition result from the behavior recognition unit 128, and determines the initial output level based on either or both of the user attributes indicated by the user profile and the behavior recognition result (S149). Here, the control unit 122 may also determine the initial output level based on the time zone. More specific examples of such processing are described below with reference to FIGS. 5 to 8.
  • FIG. 5 is a diagram showing a first example in which the initial output level is determined based on the user profile in the processing shown in FIG.
  • In this first example, a sound pressure level that differs depending on whether the gender indicated by the user profile is male or female is set as the initial output level; more specifically, the initial output level is set higher for women than for men.
  • FIG. 6 is a diagram showing a second example in which the initial output level is determined based on the user profile in the processing shown in FIG.
  • a different sound pressure level is set as an initial output level depending on the age indicated by the user profile.
  • In this second example, the initial output level increases with age for users in their 20s and older.
  • FIG. 7 is a diagram showing an example of determining the initial output level based on the action recognition result in the process shown in FIG.
  • In this example, a different sound pressure level is set as the initial output level for each action recognition result acquired from the action recognition unit 128. For example, for recognized actions during which the stimulus is harder to perceive, such as riding a bus or a car, the initial output level is set higher than in other cases.
  • the action recognition unit 128 recognizes the action of the user based on sensor data acquired from the sensor 130, for example.
  • For action recognition, the sensor 130 can include, in addition to the biometric sensor, sound sensor, light sensor, and acceleration sensor described above, various other sensors used for action recognition, such as an angular velocity sensor or a position sensor (for example, a GPS receiver or a Wi-Fi communication device).
  • For the action recognition process, various known techniques, such as those described in Japanese Patent Application Laid-Open No. 2013-3649, can be used, so a detailed description is omitted here.
  • Using such techniques, the action recognition unit 128 can recognize user actions such as walking, riding a bicycle, riding a bus, and riding a car, as in the example shown in FIG. 7.
  • the action recognition unit 128 does not necessarily have to be mounted on the wearable terminal 100, and may be mounted on the mobile terminal 200 or the server 300, for example.
  • the processor 120 transmits the sensor data acquired from, for example, the sensor 130 to the mobile terminal 200 or the server 300 via the communication device 160, and receives the result of action recognition by these devices.
  • FIG. 8 is a diagram showing an example of determining the initial output level based on the time zone in the process shown in FIG.
  • a different sound pressure level for each time zone is set as an initial output level.
  • the initial output level is higher in the time zone from 22:00 to 5:00 when the user is assumed to be sleeping than in other time zones.
  • The time zone in which the initial output level is set high in this way may be determined based on, for example, knowledge of typical users' daily schedules, or based on the pattern of time zones in which the behavior recognition unit 128 has recognized that the user is sleeping.
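Taken together, FIGS. 5 to 8 amount to a set of lookup rules for the initial output level. The sketch below combines them; the concrete numbers, attribute names, and adjustment rules are illustrative assumptions, not values taken from the disclosure.

```python
from datetime import time

def initial_output_level(profile, action, now):
    """Determine an initial sound-pressure level from user attributes,
    an action recognition result, and the time of day. All numbers
    here are placeholders for illustration."""
    level = 50                                   # assumed baseline
    if profile.get("gender") == "female":
        level += 10                              # FIG. 5: higher for women
    age = profile.get("age", 0)
    if age >= 20:
        level += (age // 10 - 1) * 5             # FIG. 6: increases with age
    if action in ("bus", "car"):
        level += 15                              # FIG. 7: stimulus harder to notice
    if now >= time(22, 0) or now < time(5, 0):
        level += 20                              # FIG. 8: user likely asleep
    return level

print(initial_output_level({"gender": "female", "age": 34}, "car", time(23, 30)))
# 105 under these assumed rules
```

In practice each rule's table (and the sleep time zone) could itself come from the user profile or from the behavior recognition unit 128, as the description notes.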
  • FIG. 9 is a flowchart illustrating an example of the output level update process (S133) in the example of FIG.
  • the control unit 122 refers to the determination result (determination result in S105 or S109 shown in FIG. 2) by the first determination unit 124 or the second determination unit 126 (S161).
  • The control unit 122 determines whether or not an output level update pattern is specified in the determination result referred to in S161 (S163). Since the determination of S123 in FIG. 3 has already been passed as a premise, the determination result at least instructs an output for giving a stimulus to the user.
  • The output level update pattern is specified in the determination result, for example, when the determination using the determiner is executed in S105 shown in FIG. 2.
  • The determiner used here is generated, for example, by machine learning based on the results obtained when stimuli were given based on sensor data in the past. If data such as the output duration and the temporal change in the output level, and not just whether or not a stimulus was given, are used in the machine learning that generates the determiner, it may be possible to generate a determiner that outputs, for an input of sensor data, not only whether an output for stimulation should be executed but also how the output level should be updated when no response is obtained from the user.
  • Alternatively, when the determination using the moving averages is executed in S109 shown in FIG. 2, the output level update pattern may be specified in the determination result according to the difference between the short-term moving average and the long-term moving average (for example, when the difference is large, the activity of the sympathetic nervous system may be decreasing sharply, so a pattern that raises the output level greatly in a short time when no response is obtained may be specified).
  • When the update pattern is specified in the determination result (YES in S163), the control unit 122 updates the output level with the specified pattern (S165).
  • On the other hand, when the update pattern is not specified (NO), the control unit 122 could update the output level according to a predetermined pattern; in the illustrated example, however, the control unit 122 updates the output level based on the short-term moving average of the sensor data, as in the example described next with reference to FIG. 10.
  • FIG. 10 is a diagram showing an example of updating the output level based on the short-term moving average of the sensor data in the process shown in FIG.
  • In this example, the larger the magnitude (absolute value) of the differential value of the short-term moving average of the sensor data, the more sharply the output level is raised when no response is obtained. For example, the rate at which the output level is raised is larger when the differential value of the short-term moving average is a large negative value (A) than when it is a small negative value (B).
  • In the above examples, the sound pressure level of a sound stimulus output from the speaker included in the output device 110 of the wearable terminal 100 is illustrated, but such processing is not limited to the sound pressure level.
  • The initial value and update pattern may be controlled in the same manner as in the above examples for the pitch and duration of the sound (where the duration means the time until the output level is updated when there is no response; the same applies to the durations below).
  • Similarly, when a stimulus by vibration is output, the initial value and update pattern of the amplitude, frequency, duration, and the like of the vibration may be controlled in the same manner, and when a stimulus by light is output, the initial value and update pattern of the light intensity, color, blinking pattern, duration, and the like may be controlled in the same manner.
  • FIG. 11 is a diagram illustrating the configuration of a system according to an embodiment of the present disclosure and the functional configuration of a server included in the system. FIG. 11 shows the same system 10 as FIG. 1, but differs in that it focuses on the functional configuration of the server 300.
  • In FIG. 11, the wearable terminal 100 shown in FIG. 1 is depicted as eyewear 100a and wristwear 100b.
  • the system 10 may include a plurality of wearable terminals 100.
  • the server 300 includes a communication device 310, a processor 320, and a storage 330. Hereinafter, each functional configuration will be further described.
  • The communication device 310 exchanges various data related to the processing in the processor 320 with the wearable terminals 100 (the eyewear 100a and the wristwear 100b; the same applies hereinafter) via the mobile terminal 200.
  • More specifically, the communication device 310 receives the data transmitted by the process of S113 shown in FIG. 2 above: for example, the sensor data acquired from the sensor 130, the result of determining whether to give the user a stimulus based on the sensor data, the contents of the output control based on that determination result (such as the output duration or the temporal change in the output level), and sensor data or user feedback acquired during or after the output.
  • In addition, the communication device 310 transmits to the wearable terminal 100 the data of the determiner that the determiner generation unit 322 included in the processor 320 generates based on the received data. Note that the communication device 310 may also be able to communicate with the wearable terminal 100 without going through the mobile terminal 200.
  • the processor 320 is implemented as a CPU or the like, and realizes various functions by operating according to programs and data stored in a memory or storage.
  • the function realized by the processor 320 includes a determiner generation unit 322.
  • The determiner generation unit 322 generates a determiner that receives at least sensor data as input and outputs whether or not to give a stimulus to the user.
  • the determiner is generated by machine learning based on a result when the wearable terminal 100 gives a stimulus based on sensor data, for example.
  • the processor 320 stores data related to processing in the storage 330 and reads out from the storage 330. Further, as described above, the processor 320 exchanges processing-related data with the wearable terminal 100 via the communication device 310.
  • FIG. 12 is a flowchart illustrating an example of server processing according to an embodiment of the present disclosure.
  • the illustrated process is executed by the processor 320 in the server 300 described above.
  • the processor 320 receives data from the wearable terminal 100 via the communication device 310 (S301).
  • The data transmitted by the wearable terminal 100 includes, for example, the sensor data acquired from the sensor 130, the result of determining whether or not to give a stimulus to the user based on the sensor data, the contents of the output control based on the determination result (such as the output duration and the temporal change in the output level), and sensor data or user feedback acquired during or after the output.
  • Next, the processor 320 determines whether or not a determiner has already been generated (S303); when a determiner has been generated, its data is stored, for example, in the storage 330. If the determiner has already been generated (YES), the processor 320 updates the determiner based on the data received in S301 (S305) and transmits the updated determiner to the wearable terminal 100 via the communication device 310 (S313). Note that the update of the determiner and the transmission of the updated determiner described above do not necessarily have to be executed every time data is received from the wearable terminal 100. For example, they may be executed when the received data accumulated in the storage 330 reaches a predetermined amount, or at a predetermined cycle independent of the reception of data.
  • When a determiner has not yet been generated (NO in S303), the processor 320 accumulates the received data in the storage 330 (S307) and determines whether or not a determiner can be generated based on the data accumulated so far (S309).
  • As described above, the determiner is generated by machine learning based on stimulation trial results collected from a plurality of wearable terminals 100, more specifically, for example, the results obtained when stimuli were given based on sensor data in the past (whether or not there was a response from the user). Therefore, for example, after the start of the service, a determiner cannot necessarily be generated until sufficient trial results have been collected.
  • the determiner generation unit 322 determines whether a sufficiently accurate determiner can be generated based on the number of accumulated data and the distribution of inputs and outputs (for example, a determiner is not generated when only results in which similar stimuli were applied to similar sensor data have been accumulated).
  • when it is determined in S309 that a determiner can be generated (YES), the determiner generation unit 322 generates a determiner using the accumulated data (S311). The processor 320 transmits the data of the generated determiner to the wearable terminal 100 via the communication device 310 (S313). It should be noted that the process of generating a determiner as described above does not necessarily have to be executed when data is received from the wearable terminal 100. For example, the determination in S309 as to whether a determiner can be generated, and the generation of the determiner in S311 when it can be generated, may be executed when the data accumulated in the storage 330 in S307 reaches a predetermined number.
  • the transmission of the data of the determiner in S313 may be executed, for example, when the determiner is generated, or when data is subsequently received from the wearable terminal 100.
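The S301 to S313 flow described above can be sketched as follows. This is an illustrative sketch only: the threshold values, the batching policy, and the trivial `_fit` placeholder are assumptions made for the example, not details given in the embodiment.

```python
from typing import Optional


class DeterminerServer:
    """Sketch of the S301-S313 flow on the server 300: receive trial data
    from a wearable terminal, accumulate it, and update or newly generate
    the determiner before sending it back."""

    def __init__(self, generation_threshold: int = 100, update_batch: int = 10):
        self.storage: list[dict] = []           # stands in for the storage 330
        self.determiner: Optional[dict] = None  # the generated determiner data
        self.generation_threshold = generation_threshold
        self.update_batch = update_batch

    def on_data_received(self, trial: dict) -> Optional[dict]:
        """Handle one reception from the wearable terminal (S301).
        Returns determiner data to transmit (S313), or None."""
        self.storage.append(trial)  # accumulate (S307)
        if self.determiner is not None:  # S303: already generated?
            # Update only after a batch of new data, not on every reception.
            if len(self.storage) % self.update_batch == 0:
                self.determiner = self._fit(self.storage)  # S305
                return self.determiner                     # S313
        elif len(self.storage) >= self.generation_threshold:  # S309
            self.determiner = self._fit(self.storage)          # S311
            return self.determiner                             # S313
        return None  # nothing to transmit this time

    def _fit(self, data: list[dict]) -> dict:
        # Trivial placeholder for the machine learning step.
        positives = sum(1 for d in data if d.get("user_responded"))
        return {"n_samples": len(data), "positive_rate": positives / len(data)}
```

As in the text, the determiner is only (re)generated once enough data has accumulated, and nothing is transmitted on receptions that trigger neither generation nor an update.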
  • the determiner generated by the determiner generation unit 322 of the processor 320 may be capable of outputting not only whether the output for stimulation should be executed for an input of sensor data but also, for example, an appropriate initial output level.
  • the determiner may be capable of outputting how to update the output level when a response from the user cannot be obtained in response to sensor data input. For example, when there are a plurality of items that are supposed to be output by the determiner, the processes of S303 to S311 can be executed for each item.
  • for example, if the determiner cannot output the initial output level (NO in S303 for this item), whether a determiner capable of outputting the initial output level can be generated based on the data accumulated in S307 is determined (S309), and if it can be generated, a determiner capable of outputting the initial output level is generated (S311). The same applies to the output level update pattern.
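The embodiment does not specify a concrete output level update pattern, but the idea of revising the level when no user response is obtained might look like the following sketch, in which the multiplicative step, the ceiling, and the attempt limit are all illustrative assumptions.

```python
def escalate_output_level(initial_level, got_response,
                          step=1.5, max_level=8.0, max_attempts=5):
    """Hypothetical output-level update pattern: raise the stimulus level
    until the user responds or a ceiling is reached. The step, ceiling,
    and attempt limit are assumptions, not values from the embodiment.

    got_response: callable taking a level and returning True if the user
    reacted to a stimulus at that level."""
    level = initial_level
    for _ in range(max_attempts):
        if got_response(level):
            return level  # user reacted; keep this level
        level = min(level * step, max_level)
    return level  # last level reached without a response
```

A determiner capable of outputting an update pattern would, in effect, learn parameters such as `step` from the accumulated trial results rather than using fixed values.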
  • the output device 110 gives a physical stimulus to the user based on the sensor data acquired from the sensor 130.
  • the determination of whether to give the stimulus is executed in the processor 120 using the determiner generated by the determiner generation unit 322 in the server 300.
  • the processor 120 of the wearable terminal 100 includes a first determination unit 124 that executes determination using the determiner. The processor 120 also includes a second determination unit 126 that determines whether to give a stimulus based on a moving average of the sensor data when the determiner is not available, more specifically, when the determiner has not yet been generated because sufficient trial results have not been collected at the server 300.
  • this enables the wearable terminal 100 to determine whether to give a stimulus with a certain degree of accuracy even at a stage where the determiner is not yet available, for example shortly after the start of the service.
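As an illustration of how such a moving-average determination might work, the following sketch compares a short-window and a long-window moving average, along the lines of the two-window scheme described later in the appendix items. The window lengths, and the assumption that the sensor value increases with the user's sympathetic nervous system activity, are illustrative.

```python
from collections import deque


class SecondDeterminationUnit:
    """Sketch of a moving-average based determination like that of the
    second determination unit 126. Window lengths are illustrative. The
    sensor value is assumed to increase with the user's sympathetic
    nervous system activity, so the short-term average dropping below
    the long-term average is read as arousal falling off."""

    def __init__(self, short_window: int = 5, long_window: int = 30):
        self.short = deque(maxlen=short_window)
        self.long = deque(maxlen=long_window)

    def should_stimulate(self, sensor_value: float) -> bool:
        self.short.append(sensor_value)
        self.long.append(sensor_value)
        if len(self.long) < self.long.maxlen:
            return False  # not enough history yet
        short_avg = sum(self.short) / len(self.short)
        long_avg = sum(self.long) / len(self.long)
        # Stimulate when the recent average falls below the longer-term one.
        return short_avg < long_avg
```

Unlike the learned determiner, this rule requires no training data, which is what makes it usable immediately after the start of the service.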
  • when the stimulation output based on the determination of the second determination unit 126 is performed, the sensor data used for the determination, the content of the output, and the output result (sensor data or user feedback acquired during or after the output) are transmitted from the wearable terminal 100 to the server 300.
  • based on the data received from the wearable terminal 100, the determiner generation unit 322 generates a determiner for determining whether to give a physical stimulus to the user based on the sensor data. As described above, at a stage where sufficient trial results have not been collected, the determiner is not necessarily generated. In this case, in the wearable terminal 100, the second determination unit 126 determines whether to give a stimulus based on the moving average of the sensor data, and the stimulus output is executed according to the result. The server 300 receives from the wearable terminal 100, and accumulates, the sensor data from when a stimulus was actually given or not given in this way, the content of the given stimulus, and data indicating the result, and a determiner can be generated from the accumulated data. Compared with the case where only sensor data is accumulated, accumulating data from occasions when a stimulus was actually given or not given based on the sensor data, even if the accuracy of those determinations was not sufficient, and using it to generate the determiner may make it possible to generate a determiner of a certain accuracy from relatively few trial results.
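The machine learning by which accumulated trials are turned into a determiner is not specified in detail in the embodiment. As one minimal, hypothetical stand-in, a nearest-centroid rule over pairs of sensor features and user responses can illustrate the idea:

```python
class TrialBasedDeterminer:
    """Minimal, hypothetical stand-in for generating a determiner from
    accumulated trial results. Each trial pairs a sensor feature vector
    with whether the user responded to the stimulus; a nearest-centroid
    rule replaces the unspecified machine learning of the embodiment."""

    def __init__(self, trials):
        # trials: list of (features, user_responded) pairs
        responded = [f for f, r in trials if r]
        ignored = [f for f, r in trials if not r]
        self.c_pos = self._centroid(responded)
        self.c_neg = self._centroid(ignored)

    @staticmethod
    def _centroid(vectors):
        return [sum(v[i] for v in vectors) / len(vectors)
                for i in range(len(vectors[0]))]

    @staticmethod
    def _dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def should_stimulate(self, features):
        # Stimulate if the current sensor state is closer to past states
        # in which the user actually responded to a stimulus.
        return self._dist2(features, self.c_pos) < self._dist2(features, self.c_neg)
```

The point carried over from the text is that the training set pairs inputs with the actual outcome of past trials, which is why trials driven by the simple moving-average rule are still useful for learning.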
  • the determiner generated by the determiner generation unit 322 in the server 300 may be capable of not only determining whether to give a stimulus based on the sensor data but also outputting an appropriate initial output level and an update pattern for the output level. These outputs may require accumulating more trial results than determining only whether to give a stimulus. In the present embodiment, trial results concerning the initial output level and the output level update pattern can be collected in the wearable terminal 100 even before a simple determiner that determines only whether to give a stimulus becomes available, so a determiner capable of outputting the initial output level and the output level update pattern can be generated.
  • the system 10 includes the wearable terminal 100, the mobile terminal 200, and the server 300, but the embodiment of the present disclosure is not limited to such an example.
  • the system 10 may not include the mobile terminal 200.
  • the function of the determiner generation unit 322 in the server 300 can be realized by a processor and storage provided in the mobile terminal 200. Therefore, the system 10 may not include the server 300, and the mobile terminal 200 may realize the same function as that described above as the function of the server 300.
  • the mobile terminal 200 may generate the determiner based on data collected from one or a plurality of wearable terminals 100 used by the same user as the mobile terminal 200, for example.
  • when the wearable terminal 100 has high information processing capability, a function similar to that described above as the function of the server 300 may be realized in the wearable terminal 100, and the system 10 need not include either the mobile terminal 200 or the server 300. In this case, the wearable terminal 100 may generate a determiner based on data it has collected itself, or based on data collected from one or more other wearable terminals used by the same user (the system 10 may include the mobile terminal 200 for this communication).
  • conversely, the configuration of the wearable terminal 100 can be simplified as much as possible, and the information processing functions can be integrated into the mobile terminal 200 or the server 300.
  • the first determination unit 124 realized by the processor 120 of the wearable terminal 100 may be realized by the processor 320 of the server 300.
  • in this case, the sensor data (first data indicating the state of the user) is transmitted from the wearable terminal 100 to the server 300.
  • the first determination unit 124 performs determination based on the sensor data using the determiner, and the determination result is transmitted to the wearable terminal 100 via the communication device 310.
  • the control unit 122 controls the output device 110 according to the determination result.
  • control unit 122 and / or the second determination unit 126 can also be realized by the processor 320 of the server 300.
  • the server 300 in the above configuration example may be replaced with the mobile terminal 200. That is, in the mobile terminal 200, the first determination unit 124, the control unit 122, and / or the second determination unit 126 may be realized by a processor.
  • the wearable terminal 100 described above with reference to FIG. 1 and elsewhere has, in addition to the first determination unit 124 that performs determination using a determiner, a second determination unit 126 that performs determination based on a moving average of the sensor data.
  • the result of determination by the second determination unit 126 is eventually sent to the server 300 via the communication device 160 as data (second data) for generating the determiner used by the first determination unit 124.
  • however, the wearable terminal having the first determination unit 124 does not necessarily have the second determination unit 126; in another example, a wearable terminal may have the first determination unit 124 without the second determination unit 126.
  • in this case, a determiner generated based on second data provided by other wearable terminals is provided from the server 300, which makes it possible to perform the determination based on sensor data in that wearable terminal.
  • alternatively, the second determination unit 126 may be implemented in the wearable terminal 100 or the mobile terminal 200, in which case the first determination unit 124 and the determiner generation unit 322 are implemented in the server 300.
  • FIG. 13 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
  • the illustrated information processing apparatus 900 can realize, for example, a server, a mobile terminal, or a wearable terminal in the above-described embodiment.
  • the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
  • the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
  • the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
  • the information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
  • the ROM 903 stores programs and calculation parameters used by the CPU 901.
  • the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
  • the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
  • the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
  • the output device 917 is configured by a device capable of notifying the acquired information to the user using a sense such as vision, hearing, or touch.
  • the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator.
  • the output device 917 outputs the result obtained by the processing of the information processing device 900 as video such as text or images, as audio such as voice or sound, or as vibration.
  • the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
  • the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 919 stores, for example, programs executed by the CPU 901 and various data, and various data acquired from the outside.
  • the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
  • the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
  • the drive 921 also writes records to the attached removable recording medium 927.
  • the connection port 923 is a port for connecting a device to the information processing apparatus 900.
  • the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
  • the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
  • the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
  • the communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB).
  • the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
  • the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
  • the communication network 931 connected to the communication device 925 is a network connected by wire or wireless, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
  • the imaging device 933 is an apparatus that images real space and generates a captured image using various members such as an imaging element, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the imaging element.
  • the imaging device 933 may capture a still image or may capture a moving image.
  • the sensor 935 is various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, or a sound sensor (microphone).
  • the sensor 935 acquires information about the state of the information processing apparatus 900 itself, such as the attitude of the information processing apparatus 900, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around the information processing apparatus 900.
  • the sensor 935 may include a GPS receiver that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the device.
  • Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
  • embodiments of the present disclosure may include, for example, the information processing apparatus (server, mobile terminal, or wearable terminal) as described above, a system, an information processing method executed by the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
  • (1) An information processing apparatus including a determination unit that performs, based on first data indicating the state of a user, at least a determination of whether to give a physical stimulus to the user, wherein the determination is executed using a determiner generated based on second data including results of determining whether to give the stimulus based on moving averages of the first data in at least two time windows of different lengths.
  • (2) The information processing apparatus according to (1), wherein the moving averages include a first moving average of the first data in a first time window and a second moving average of the first data in a second time window longer than the first time window.
  • (3) The information processing apparatus according to (2), wherein the first data is handled such that its value increases as the sympathetic nervous system activity of the user increases, and the second data includes a result of determining to give the stimulus to the user when the first moving average is lower than the second moving average.
  • (4) The information processing apparatus according to any one of (1) to (3), wherein the second data further includes a level of the stimulus given to the user when it is determined to give the stimulus to the user, and the determination unit further determines the level of the stimulus to be applied based on the first data.
  • (5) The information processing apparatus according to (4), wherein the second data includes the level determined according to an attribute of the user.
  • (6) The information processing apparatus according to (4) or (5), wherein the second data includes the level determined according to an action recognition result for the user.
  • (7) The information processing apparatus according to any one of (4) to (6), wherein the second data includes the level determined according to a time zone.
  • (8) The information processing apparatus according to any one of (1) to (7), wherein the second data further includes a temporal change in the stimulus given to the user when it is determined to give a physical stimulus to the user, and the determination unit further determines the pattern of the temporal change of the stimulus to be applied based on the first data.
  • (9) The information processing apparatus according to (8), wherein the second data includes the temporal change determined based on the moving averages.
  • (10) The information processing apparatus according to any one of (1) to (9), wherein, when it is determined to give a physical stimulus to the user, the second data further includes the first data or feedback from the user acquired during or after the stimulus is given.
  • (11) The information processing apparatus according to any one of (1) to (10), wherein the determiner is generated by machine learning based on the second data.
  • (12) The information processing apparatus according to any one of (1) to (11), wherein the first data includes sensor data acquired from a sensor worn by the user.
  • (13) The information processing apparatus according to any one of (1) to (12), further including a determiner generation unit configured to generate the determiner.
  • (14) An information processing method including performing, by a processor, at least a determination of whether to give a physical stimulus to a user based on first data indicating the state of the user, wherein the determination is executed using a determiner generated based on second data including results of determining whether to give the stimulus based on moving averages of the first data in at least two time windows of different lengths.


Abstract

The present invention makes it possible to determine more simply whether a physical stimulus should be applied to a user on the basis of information indicating the user's state. Provided is an information processing apparatus including a determiner generation unit that generates, based on second data including a result of determining whether a physical stimulus should be applied to the user according to at least two moving averages, in time windows of different lengths, of first data indicating the state of the user, a determiner for determining, at least, whether the stimulus should be applied based on the first data.
PCT/JP2016/053079 2015-03-12 2016-02-02 Appareil de traitement d'informations, procédé de traitement d'informations et programme WO2016143415A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015049397A JP2016170589A (ja) 2015-03-12 2015-03-12 情報処理装置、情報処理方法およびプログラム
JP2015-049397 2015-03-12

Publications (1)

Publication Number Publication Date
WO2016143415A1 (fr) 2016-09-15

Family

ID=56879431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/053079 WO2016143415A1 (fr) 2015-03-12 2016-02-02 Appareil de traitement d'informations, procédé de traitement d'informations et programme

Country Status (2)

Country Link
JP (1) JP2016170589A (fr)
WO (1) WO2016143415A1 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017199269A (ja) * 2016-04-28 2017-11-02 株式会社デンソー 車載機器制御装置
JP7386137B2 (ja) * 2019-08-30 2023-11-24 任天堂株式会社 情報処理システム、情報処理プログラム、情報処理方法、および、情報処理装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0648099U (ja) * 1992-11-25 1994-06-28 株式会社小糸製作所 車両用レーダ装置
JP2005216325A (ja) * 2005-04-06 2005-08-11 Pioneer Electronic Corp 覚醒装置
JP2008206688A (ja) * 2007-02-26 2008-09-11 Denso Corp 居眠り警報装置
JP2010273752A (ja) * 2009-05-27 2010-12-09 Kamata Toru 入眠判定システム
JP2011159108A (ja) * 2010-02-01 2011-08-18 Denso Corp 覚醒支援装置
US20120296226A1 (en) * 2011-05-17 2012-11-22 Industrial Technology Research Institute Predictive drowsiness alarm method
JP2013171546A (ja) * 2012-02-23 2013-09-02 Sony Corp 再生装置、再生方法およびプログラム


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018063490A (ja) * 2016-10-11 2018-04-19 株式会社東海理化電機製作所 触覚呈示装置
WO2018070275A1 (fr) * 2016-10-11 2018-04-19 株式会社東海理化電機製作所 Dispositif de présentation tactile

Also Published As

Publication number Publication date
JP2016170589A (ja) 2016-09-23


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16761387

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16761387

Country of ref document: EP

Kind code of ref document: A1