WO2019026396A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2019026396A1
Authority
WO
WIPO (PCT)
Prior art keywords
option
information processing
control unit
output control
user
Prior art date
Application number
PCT/JP2018/019663
Other languages
English (en)
Japanese (ja)
Inventor
真里 斎藤
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 filed Critical ソニー株式会社
Publication of WO2019026396A1 publication Critical patent/WO2019026396A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/335Filtering based on additional data, e.g. user or group profiles

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a program.
  • Patent Document 1 discloses a technique for presenting an appropriate meal menu according to the health condition of the user.
  • the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of naturally guiding an appropriate selection by a user.
  • According to the present disclosure, there is provided an information processing apparatus including an output control unit that acquires a second option and a third option based on the category value of a first option estimated from the user's behavior, and controls output of the first option, the second option, and the third option, wherein the output control unit acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • According to the present disclosure, there is also provided an information processing method including: acquiring, by a processor, a second option and a third option based on the category value of a first option estimated from the user's behavior; and controlling output of the first option, the second option, and the third option, wherein the controlling further includes acquiring the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • According to the present disclosure, there is also provided a program for causing a computer to function as an information processing apparatus including an output control unit that acquires a second option and a third option based on the category value of a first option estimated from the user's behavior, and controls output of the first option, the second option, and the third option, wherein the output control unit acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • 1. Embodiment
  • 1.1. Outline of embodiment
  • 1.2. System configuration example
  • 1.3. Functional configuration example of information processing terminal 10
  • 1.4. Functional configuration example of information processing server 20
  • 1.5.
  • the apparatus as described above also includes, for example, an apparatus that provides a presentation to help the user achieve the goal.
  • A device that assists in achieving a goal may, for example, recommend a lower-calorie meal, recommend an action that secures more study time, or recommend an item or action that involves less expense.
  • FIG. 1 is a diagram for describing an overview of an embodiment of the present disclosure.
  • The upper part of FIG. 1 shows a situation in which a conventional device presents a recommended meal menu to the user U1, who aims to lose weight.
  • The conventional device detects from the utterance UO1 of the user U1 that the user U1 is considering beef as a meal option, and causes the display unit 910 to display the first option O1 desired by the user together with a second option O2 recommending chicken, which has fewer calories.
  • the conventional device outputs a speech utterance SO1 explicitly indicating that the second option is superior to the first option desired by the user.
  • An information processing server that executes processing based on an information processing method according to an embodiment of the present disclosure acquires a second option and a third option based on the category value of the first option estimated from the user's action, and controls the output of the first to third options.
  • One feature of the information processing server according to the present embodiment is that it acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • The lower part of FIG. 1 shows a scene in which the information processing server according to the present embodiment presents a recommended meal menu to the user U1.
  • the information processing server first specifies a first option O1 which is an option desired by the user U1 based on the utterance UO1 of the user U1. Subsequently, the information processing server acquires a second option O2 having a category value lower than that of the first option O1.
  • the above category value is set for each category of goal, and indicates a numerical value that is an important factor for achieving the goal.
  • In the example shown in FIG. 1, where the goal of the user U1 is to lose weight, calories correspond to the category value. The information processing server according to the present embodiment therefore acquires chicken, which has fewer calories than the first option O1, as the second option O2.
  • Furthermore, the information processing server acquires fish, which has fewer calories than the second option O2, as the third option O3, and causes the display unit 110 to display the first to third options O1 to O3.
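  • As a minimal illustration of this relationship, the following sketch (hypothetical Python; the dishes and calorie values are invented, and at least two lower-calorie candidates are assumed) acquires a second and a third option whose category values place the second option in the middle:

        # Illustrative sketch of the ordering idea in FIG. 1; values are made up.
        options = {"beef": 620, "chicken": 450, "fish": 300}  # kcal, invented for the example

        first = ("beef", options["beef"])                      # first option estimated from utterance UO1
        candidates = [(n, v) for n, v in options.items() if v < first[1]]
        second = max(candidates, key=lambda o: o[1])           # closest lower value -> "chicken"
        third = min(candidates, key=lambda o: o[1])            # even lower value    -> "fish"

        assert third[1] < second[1] < first[1]                 # second option sits in the middle
        print([first[0], second[0], third[0]])                 # presentation order O1, O2, O3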
  • In general, when several options are presented, users tend to select the one in the middle. The information processing server can use this tendency to guide the user U1 to select the second option O2, which is more appropriate for achieving the goal.
  • The information processing server may also cause the display unit 110 to display the first to third options O1 to O3 and cause a voice utterance SO2 corresponding to the first to third options O1 to O3 to be output.
  • At this time, without explicitly mentioning calories, the information processing server can improve the attraction effect of the second option by using a modifier that is attractive to the user U1, such as "condensed umami."
  • According to the information processing server of the present embodiment, the user can be guided naturally to a more appropriate selection without spoiling the user's mood through explicit expression. The information processing server of the present embodiment can therefore be expected to effectively assist the achievement of the user's goal while keeping the user's motivation high.
  • Although FIG. 1 shows an example in which the information processing server displays one option each for the first to third options O1 to O3, the first to third options O1 to O3 according to the present embodiment are not limited to this example.
  • the information processing server according to the present embodiment may obtain a plurality of first to third options O1 to O3 respectively and cause the display unit 110 to display the plurality of options.
  • The information processing server can also acquire, for example, two first options O1a and O1b, four second options O2a to O2d, and two third options O3a and O3b, and cause the display unit 110 to display them.
  • the second option O2 may not necessarily be at the center of the presented options.
  • The information processing server may display, for example, two first options O1a and O1b, one second option O2, and four third options O3a to O3d in this order. That is, the information processing server may display the second option O2 third among the seven options. Also in this case, an effect of attracting the user to the second option O2 can be expected.
  • FIG. 2 is a block diagram showing an exemplary configuration of the information processing system according to the present embodiment.
  • the information processing system according to the present embodiment includes an information processing terminal 10 and an information processing server 20.
  • the information processing terminal 10 and the information processing server 20 are connected via the network 30 so as to be able to communicate with each other.
  • the information processing terminal 10 is an information processing apparatus having a function of presenting a plurality of options to the user based on control by the information processing server 20.
  • the above options include articles, services, actions, etc. recommended to the user for achieving the goal.
  • the information processing terminal 10 according to the present embodiment may have a function of collecting various information related to the user's action.
  • the information processing terminal 10 according to the present embodiment is realized as various devices having the above functions.
  • the information processing terminal 10 according to the present embodiment may be, for example, a mobile phone, a smartphone, a tablet, a wearable device, a computer, a dedicated device of a stationary type or an autonomous moving type, or the like.
  • the information processing server 20 is an information processing apparatus having a function of controlling the presentation of options by the information processing terminal 10.
  • One feature of the information processing server 20 according to the present embodiment is that, based on the first option desired by the user, it acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • the network 30 has a function of connecting the information processing terminal 10 and the information processing server 20.
  • the network 30 may include the Internet, a public network such as a telephone network, a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), a WAN (Wide Area Network), and the like.
  • the network 30 may include a leased line network such as an Internet Protocol-Virtual Private Network (IP-VPN).
  • the network 30 may also include a wireless communication network such as Wi-Fi (registered trademark) or Bluetooth (registered trademark).
  • the example of the system configuration of the information processing system according to the present embodiment has been described above.
  • the configuration described above with reference to FIG. 2 is merely an example, and the configuration of the information processing system according to the present embodiment is not limited to such an example.
  • the functions of the information processing terminal 10 and the information processing server 20 according to the present embodiment may be realized by a single device.
  • The configuration of the information processing system according to the present embodiment can be flexibly modified according to specifications and operation.
  • FIG. 3 is a block diagram showing an example of a functional configuration of the information processing terminal 10 according to the present embodiment.
  • The information processing terminal 10 according to the present embodiment includes a display unit 110, an audio output unit 120, an audio input unit 130, an imaging unit 140, a sensor unit 150, a control unit 160, and a server communication unit 170.
  • the display unit 110 has a function of outputting visual information such as an image or text.
  • the display unit 110 according to the present embodiment may output visual information related to the first to third options based on control by the information processing server 20, for example.
  • the display unit 110 according to the present embodiment includes a display device that presents visual information. Examples of the display device include a liquid crystal display (LCD) device, an organic light emitting diode (OLED) device, and a touch panel.
  • The voice output unit 120 has a function of outputting auditory information, including voice utterances.
  • For example, the audio output unit 120 according to the present embodiment may output auditory information related to the first to third options based on control by the information processing server 20.
  • the audio output unit 120 according to the present embodiment includes an audio output device such as a speaker or an amplifier.
  • the voice input unit 130 has a function of collecting sound information such as an utterance by a user and a background sound.
  • the sound information collected by the voice input unit 130 is used for voice recognition and action recognition by the information processing server 20.
  • the voice input unit 130 according to the embodiment includes a microphone for collecting sound information.
  • the imaging unit 140 has a function of capturing an image including the user and the surrounding environment.
  • the image captured by the imaging unit 140 is used for user recognition and action recognition by the information processing server 20.
  • the imaging unit 140 according to the present embodiment includes an imaging device capable of capturing an image. Note that the above image includes moving images as well as still images.
  • the sensor unit 150 has a function of collecting various sensor information related to the user's behavior.
  • the sensor information collected by the sensor unit 150 is used for action recognition by the information processing server 20.
  • the sensor unit 150 includes, for example, an acceleration sensor, a gyro sensor, a geomagnetic sensor, a heat sensor, an optical sensor, a vibration sensor, a GNSS (Global Navigation Satellite System) signal receiving device, and the like.
  • The control unit 160 according to the present embodiment has a function of controlling each component included in the information processing terminal 10.
  • The control unit 160 controls, for example, the start and stop of each component. The control unit 160 can also input control signals generated by the information processing server 20 to the display unit 110 or the audio output unit 120. Moreover, the control unit 160 according to the present embodiment may have a function equivalent to that of the output control unit 230 of the information processing server 20 described later.
  • the server communication unit 170 has a function of performing information communication with the information processing server 20 via the network 30. Specifically, the server communication unit 170 transmits, to the information processing server 20, the sound information collected by the voice input unit 130, the image information captured by the imaging unit 140, and the sensor information collected by the sensor unit 150. The server communication unit 170 also receives, from the information processing server 20, control signals and artificial voices related to the output of the first to third options.
  • the example of the functional configuration of the information processing terminal 10 according to the present embodiment has been described above.
  • the above configuration described with reference to FIG. 3 is merely an example, and the functional configuration of the information processing terminal 10 according to the present embodiment is not limited to such an example.
  • the information processing terminal 10 according to the present embodiment may not necessarily include all of the configurations shown in FIG. 3.
  • the information processing terminal 10 can be configured not to include the imaging unit 140, the sensor unit 150, and the like.
  • the control unit 160 according to the present embodiment may have the same function as the output control unit 230 of the information processing server 20.
  • The functional configuration of the information processing terminal 10 according to the present embodiment can be flexibly modified according to specifications and operation.
  • FIG. 4 is a block diagram showing an example of a functional configuration of the information processing server 20 according to the present embodiment.
  • the information processing server 20 according to the present embodiment includes a recognition unit 210, a determination unit 220, an output control unit 230, a voice synthesis unit 240, a storage unit 250, and a terminal communication unit 260.
  • the storage unit 250 also includes a user DB 252, a condition setting DB 254, and an option DB 256.
  • the recognition unit 210 has a function of performing recognition related to the user.
  • The recognition unit 210 can perform user recognition, for example, by comparing the speech or image of the user collected by the information processing terminal 10 with the voice features or images of the user stored in advance in the user DB 252.
  • the recognition unit 210 can recognize the action of the user based on the sound information, the image, and the sensor information collected by the information processing terminal 10. For example, the recognition unit 210 can perform voice recognition based on the user's utterance collected by the information processing terminal 10, and can recognize that the user is about to eat a meal. Also, for example, the recognition unit 210 can recognize that the user is exercising based on the image and the sensor information collected by the information processing terminal 10. Also, for example, the recognition unit 210 can recognize that the user is searching for a restaurant based on the search behavior of the user. As described above, the user's action recognized by the recognition unit 210 includes a speech action and a search action as well as an action accompanied by a large body movement.
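  • As a purely illustrative sketch (the publication does not specify a recognition algorithm), a trivial keyword-based stand-in for recognizing a meal-related action from a transcribed utterance could look like this:

        from typing import Optional

        # Hypothetical keyword list; a real recognizer would combine voice recognition
        # with image and sensor information, as described above.
        MEAL_KEYWORDS = {"eat", "meal", "dinner", "lunch", "beef", "steak"}

        def recognize_action(transcript: str) -> Optional[str]:
            words = set(transcript.lower().split())
            return "meal" if words & MEAL_KEYWORDS else None

        print(recognize_action("I think I'll eat beef tonight"))  # -> "meal"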
  • the determination unit 220 has a function of determining whether or not the action of the user recognized by the recognition unit 210 satisfies the multiple option presentation condition. Specifically, the determination unit 220 determines whether the action recognized by the recognition unit 210 is an action that affects the attainment of the goal. For example, when the goal is weight loss, the determination unit 220 may determine whether the recognized action is a meal or exercise-related action.
  • the determination unit 220 subsequently determines whether the category value of the action exceeds a threshold.
  • the above category value is set for each category of goal, and indicates a numerical value that is an important factor for achieving the goal.
  • the determination unit 220 may determine whether the calorie intake of the meal, which is the first option the user is about to take, exceeds a threshold. At this time, when the intake calories exceed the threshold, the determination unit 220 determines that the action of the user satisfies the presentation condition of the plurality of options.
  • the determination unit 220 may determine whether the intensity of exercise that the user is going to perform or the consumed calories exceeds a threshold. At this time, when the exercise intensity and the consumed calories fall below the threshold value, the determination unit 220 determines that the action of the user satisfies the presentation condition of the plurality of options.
  • The determination unit 220 can make the above determinations based on the category corresponding to the goal stored in the condition setting DB 254 and on the threshold for the category value of that category.
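  • A minimal sketch of this presentation-condition check follows; the goal-to-category mapping and the thresholds are assumptions standing in for the contents of the condition setting DB 254:

        # Assumed per-goal conditions; thresholds and trigger directions are invented.
        CONDITIONS = {
            "weight_loss":   {"category": "meal",     "threshold": 600, "trigger": "above"},
            "more_exercise": {"category": "exercise", "threshold": 150, "trigger": "below"},
        }

        def satisfies_presentation_condition(goal: str, category: str, value: float) -> bool:
            cond = CONDITIONS.get(goal)
            if cond is None or cond["category"] != category:
                return False                        # the action does not affect the goal
            if cond["trigger"] == "above":
                return value > cond["threshold"]    # e.g. calorie intake is too high
            return value < cond["threshold"]        # e.g. exercise intensity is too low

        print(satisfies_presentation_condition("weight_loss", "meal", 620))  # -> True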
  • The output control unit 230 has a function of acquiring the second option and the third option based on the category value of the first option estimated from the action of the user, and of controlling the output of the first to third options. At this time, one feature of the output control unit 230 according to the present embodiment is that it acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • Using the tendency of users to readily select the middle option, the output control unit 230 can guide the user to select the second option, which is more appropriate for achieving the goal than the first option desired by the user.
  • The output control unit 230 may output the plurality of options in the order of the first option, the second option, and the third option. Furthermore, when outputting the plurality of options as visual information on the display unit 110, the output control unit 230 may display the second option near the center of the display area. More specifically, the output control unit 230 may perform display control such that the second option is placed closer to the center of the display area than the first and third options. For example, the output control unit 230 may display the first option on the left side of the display area, the second option near the center of the display area, and the third option on the right side of the display area. Also, for example, the output control unit 230 may display the first option in the upper part of the display area, the second option in the middle of the display area, and the third option in the lower part of the display area.
  • As described above, there may be a plurality of each of the first to third options according to the present embodiment. Also in this case, the output control unit 230 causes the plurality of second options to be displayed at positions closer to the center of the display area than the first and third options. For example, when there are one first option, two second options, and one third option, the output control unit 230 may display the two second options near the center of the display area and display the first option and the third option at both ends.
  • According to such control, the second option is placed in the middle in terms of physical position as well as category value, and a higher guidance effect can be expected.
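  • One possible way to realize such placement (an illustrative sketch, not the claimed method) is to fill display slots outward from the centre with the second options first:

        # Arrange options left to right so that second options occupy the slots
        # nearest the centre of the display area; first and third options fill the rest.
        def arrange(first: list, second: list, third: list) -> list:
            slots = len(first) + len(second) + len(third)
            order = [None] * slots
            centre = slots // 2
            by_distance = sorted(range(slots), key=lambda i: abs(i - centre))
            for opt, pos in zip(second, by_distance):
                order[pos] = opt
            remaining = [i for i in range(slots) if order[i] is None]
            for opt, pos in zip(first + third, remaining):
                order[pos] = opt
            return order

        print(arrange(["O1a", "O1b"], ["O2a", "O2b"], ["O3a", "O3b"]))
        # -> ['O1a', 'O1b', 'O2b', 'O2a', 'O3a', 'O3b']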
  • The output control unit 230 performs the control described above based on the category value of the first option satisfying a predetermined condition. Specifically, when the determination unit 220 determines that the category value of the first option satisfies the presentation condition of a plurality of options, the output control unit 230 acquires, from the option DB 256, a second option and a third option belonging to the same category as the category of the first option.
  • the output control unit 230 has a function of controlling the output mode of the first to third options so as to improve the attraction effect of the second option to the user.
  • the above-described functions of the output control unit 230 according to the present embodiment will be separately described in detail.
  • The speech synthesis unit 240 has a function of synthesizing the artificial speech output by the information processing terminal 10 based on control by the output control unit 230. At this time, the speech synthesis unit 240 synthesizes artificial speech corresponding to the output mode set by the output control unit 230.
  • the storage unit 250 includes a user DB 252, a condition setting DB 254, and an option DB 256.
  • the user DB 252 stores various information related to the user.
  • the user DB 252 stores, for example, a user's face image and voice feature.
  • the user DB 252 may store information such as gender, age, preference, and tendency of the user.
  • The condition setting DB 254 stores a goal to be achieved by the user, a threshold for the category value related to the category of the goal, and the like.
  • the above target or threshold may be set by the user, or may be set dynamically by the determination unit 220 or the like.
  • For example, the determination unit 220 can also dynamically set weight loss, an increase in muscle mass, or the like as a goal based on the fact that the recognition unit 210 rarely recognizes exercise by the user.
  • the option DB 256 associates and stores a plurality of options and a category value related to the option for each target category.
  • the options and category values stored by the option DB 256 may be dynamically accumulated based on, for example, information published on the Internet.
  • The options according to the present embodiment do not necessarily need to be stored in the option DB 256.
  • the output control unit 230 according to the present embodiment may acquire, for example, the second option or the third option from another device via the network 30.
  • the terminal communication unit 260 has a function of performing information communication with the information processing terminal 10 via the network 30. Specifically, the terminal communication unit 260 receives sound information, image information, and sensor information from the information processing terminal 10. Also, the terminal communication unit 260 transmits the control signal generated by the output control unit 230 and the artificial voice synthesized by the voice synthesis unit 240 to the information processing terminal 10.
  • the functional configuration of the information processing server 20 has been described.
  • The functional configuration described above with reference to FIG. 4 is merely an example, and the functional configuration of the information processing server 20 according to the present embodiment is not limited to this example.
  • the information processing server 20 may not necessarily have all of the configurations shown in FIG. 4.
  • the recognition unit 210, the determination unit 220, the voice synthesis unit 240, and the storage unit 250 can be provided in another device different from the information processing server 20.
  • The functional configuration of the information processing server 20 according to the present embodiment can be flexibly modified according to specifications and operation.
  • output control of a plurality of options by the output control unit 230 according to the present embodiment will be described by giving a specific example.
  • The output control unit 230 according to the present embodiment can control the output modes of the first to third options so as to improve the attraction effect of the second option on the user among the plurality of presented options. That is, the output control unit 230 can control the outputs of the first to third options so that the second option looks more attractive to the user than the other options.
  • the above output modes include, for example, language expressions.
  • the output control unit 230 can control the linguistic expressions of the first to third options so as to improve the attraction effect of the second option. More specifically, the output control unit 230 may generate an explanatory note for the first to third options so that the second option looks attractive to the user.
  • FIG. 5 is a diagram for describing control of language expression by the output control unit 230 according to the present embodiment.
  • The upper part of FIG. 5 shows a scene in which the user U2 cannot decide between a $500 chair and a $300 chair when purchasing a chair.
  • In this case, the determination unit 220 determines that the action of the user U2 satisfies the presentation condition of the plurality of options. More specifically, the determination unit 220 may make this determination based on the goal of the user U2 being saving money and on the price of at least one of the chairs the user U2 is considering purchasing, that is, the category value, exceeding the threshold. In the example shown in FIG. 5, based on the threshold being, for example, $400, the determination unit 220 determines the $500 chair as the first option O1.
  • the output control unit 230 acquires the second option O2 and the third option O3 based on the above determination by the determination unit 220.
  • In this case, since the user U2 is already considering the $300 chair, which has a category value lower than that of the first option O1, the output control unit 230 may use that chair as the second option O2 as it is. That is, the output control unit 230 sets the $300 chair as the second option O2 and acquires a $100 chair, which has a category value lower than that of the second option O2, as the third option O3.
  • Subsequently, the output control unit 230 causes the display unit 110 of the information processing terminal 10 to display the first to third options O1 to O3, and causes the voice output unit 120 to output a voice utterance SO3 indicating that another option, that is, the third option O3, has been found.
  • the output control unit 230 causes the voice output unit 120 to output a voice utterance SO4 corresponding to an explanatory sentence related to the first to third options following the voice utterance SO3.
  • At this time, the output control unit 230 may generate explanatory text for the first to third options O1 to O3 so that the second option O2 looks attractive to the user U2, and may cause the voice output unit 120 to output the explanatory text.
  • In the example shown in FIG. 5, the output control unit 230 may control linguistic expressions such as the target of the recommendation ("to the smart U2"), a modifier relating to the second and third options O2 and O3 ("of a simple design"), and the reason for the recommendation ("It looks good").
  • The output control unit 230 can control the linguistic expressions as described above by acquiring information on the age, gender, preferences, and the like of the user U2 stored in the user DB 252. According to the output control unit 230 of the present embodiment, the linguistic expressions of the first to third options can be controlled so that the explanatory text related to the second option looks more attractive to the user, further enhancing the attraction effect of the second option. Although FIG. 5 illustrates a case in which the output control unit 230 outputs the explanatory text for the first to third options by voice, the output control unit 230 may also cause the display unit 110 to display the explanatory text as visual information.
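  • As a hypothetical sketch of such template-based explanatory text (the field names and phrases are assumptions, not taken from the publication), the recommended option could be framed with wording derived from a profile record like those in the user DB 252:

        # Only the recommended (second) option receives the persuasive framing.
        def describe(option: str, profile: dict, recommended: bool) -> str:
            if not recommended:
                return option
            target = f"For the smart {profile.get('name', 'user')}: "
            detail = f"{option}, with {profile.get('preference', 'a simple design')} that looks good on you"
            return target + detail

        profile = {"name": "U2", "age": 30, "preference": "a simple design"}
        print(describe("the $300 chair", profile, recommended=True))
        print(describe("the $500 chair", profile, recommended=False))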
  • the output modes controlled by the output control unit 230 also include visual expressions.
  • the output control unit 230 can control visual expression relating to the first to third options so as to enhance the attraction effect of the second option. More specifically, the output control unit 230 may control display of images and text corresponding to the first to third options so that the second option looks attractive to the user.
  • FIG. 6 is a diagram for describing control of visual expression by the output control unit 230 according to the present embodiment.
  • FIG. 6 shows first to third options O1 to O3 displayed on the display unit 110 under the control of the output control unit 230.
  • As illustrated, the output control unit 230 may obtain images I1 to I3 representing the first to third options and cause the display unit 110 to display them. At this time, the output control unit 230 according to the present embodiment can acquire the images I1 to I3 and cause the display unit 110 to display them so that, for example, the image I2 representing the second option O2 looks more attractive to the user.
  • For example, the output control unit 230 may acquire and display images I2 and I3 that look more appetizing for the second option O2 and the third option O3, and may acquire and display an image I1 that looks less appealing for the first option O1.
  • the horizontal lines drawn in the image I1 indicate poor appearance.
  • the output control unit 230 can acquire the image I1 based on, for example, the shape and size of the dish, the lightness, the resolution, the color, and the like of the image.
  • The output control unit 230 may also improve the attraction effect of the second option O2 by controlling the display effects of the acquired images I1 to I3. Specifically, by performing image processing on the acquired images I1 to I3, the output control unit 230 can control the display effects such that the image I2 representing the second option O2 looks more attractive to the user.
  • the above display effects include, for example, lightness, contrast, saturation, resolution, noise and the like of the image.
  • the output control unit 230 may perform the correction regarding the parameters as described above so that the appearance of the image I2 representing the second option O2 is improved.
  • the output control unit 230 can also control parameters or add noise so that the appearance of the image I1 representing the first option O1 is degraded.
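  • A brief sketch of such parameter control using the Pillow library (assumed to be available; the file paths are placeholders) might adjust brightness, contrast, and saturation in opposite directions for the second and first options:

        from PIL import Image, ImageEnhance

        def adjust(path: str, brightness: float, contrast: float, saturation: float) -> Image.Image:
            img = Image.open(path)
            img = ImageEnhance.Brightness(img).enhance(brightness)
            img = ImageEnhance.Contrast(img).enhance(contrast)
            return ImageEnhance.Color(img).enhance(saturation)   # Color acts as a saturation control

        # Second option emphasised, first option dulled; the factors are illustrative only.
        i2 = adjust("chicken.jpg", 1.10, 1.10, 1.15)
        i1 = adjust("beef.jpg",    0.90, 0.90, 0.85)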
  • the visual expression according to the present embodiment includes a color expression.
  • the output control unit 230 may control the color expression of the first to third options O1 to O3 so as to improve the attraction effect of the second option.
  • For example, the output control unit 230 can enhance the attraction effect of the second option O2 by using colors that stimulate the appetite.
  • Specifically, the output control unit 230 may control the background colors BC1 to BC3 of the images I1 to I3. For example, the output control unit 230 may improve the attractiveness of the image I2 by using a warm, appetizing color as the background color BC2 of the second option O2, and may reduce the attractiveness of the image I1 by using a cold color that suppresses the appetite as the background color BC1 of the first option O1. The output control unit 230 may also control the color of the text related to the first to third options O1 to O3 in addition to the background colors BC1 to BC3.
  • the assignment of colors described above is merely an example, and the control of color expression by the output control unit 230 is not limited to such an example.
  • the output control unit 230 can appropriately select a color that the user feels more attractive based on, for example, the season, the temperature, the preference of the user, and the like.
  • the output modes controlled by the output control unit 230 also include auditory expressions.
  • the output control unit 230 can control auditory expressions related to the first to third options so as to improve the attraction effect of the second option. More specifically, the output control unit 230 may control voice utterance and background sound so that the second option looks attractive to the user.
  • FIG. 7 is a diagram for describing control of auditory expression by the output control unit 230 according to the present embodiment.
  • In FIG. 7, the first to third options O1 to O3 presented to the user U3, who is about to visit a nearby library, are shown together with map information.
  • In this case, the determination unit 220 determines that the action of the user U3 satisfies the presentation condition of the plurality of options based on the utterance UO3 of the user U3 recognized by the recognition unit 210. More specifically, the determination unit 220 makes this determination based on the goal of the user U3 being elimination of lack of exercise and on the exercise intensity and calorie consumption of walking to the nearby library, that is, of the first option O1, falling below the threshold.
  • Subsequently, the output control unit 230 acquires the second option O2 and the third option O3 based on the above determination by the determination unit 220. At this time, in order to help eliminate the lack of exercise of the user U3, the output control unit 230 may acquire a second option O2 having a higher exercise intensity than the first option O1 and a third option O3 having a higher exercise intensity than the second option O2.
  • Subsequently, the output control unit 230 causes the display unit 110 to display the first to third options O1 to O3 together with the map information, and causes the voice output unit 120 to output a voice utterance SO5 informing the user that other options, that is, the second option O2 and the third option O3, have been found.
  • the output control unit 230 may control the display of the map information so that the second option O2 is disposed near the center of the display area as illustrated. According to the above control, the second option O2 can be displayed at a position where the user U3 can easily catch the eye, and the attraction effect of the second option O2 can be enhanced.
  • the output control unit 230 can output the background sound BS to the audio output unit 120 along with the above control.
  • the output control unit 230 selects the background sound BS in which the attraction effect of the second option O2 is enhanced and causes the sound output unit 120 to output the selected background sound BS.
  • the output control unit 230 can enhance the motivation of the user U3 with respect to exercise, for example, by outputting up-tempo music.
  • the output control unit 230 may select, as the background sound BS, a music that is frequently heard by the user U3 when jogging.
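  • An illustrative sketch of such a selection (the listening history and the tempo threshold are invented for the example) could filter the user's history by listening context and tempo:

        from typing import Optional

        listening_history = [
            {"title": "Track A", "bpm": 170, "contexts": {"jogging"}},
            {"title": "Track B", "bpm": 95,  "contexts": {"relaxing"}},
            {"title": "Track C", "bpm": 160, "contexts": {"jogging", "commuting"}},
        ]

        def pick_background_sound(context: str, min_bpm: int = 140) -> Optional[str]:
            candidates = [t for t in listening_history
                          if context in t["contexts"] and t["bpm"] >= min_bpm]
            candidates.sort(key=lambda t: t["bpm"], reverse=True)
            return candidates[0]["title"] if candidates else None

        print(pick_background_sound("jogging"))  # -> "Track A"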
  • the output control unit 230 according to the present embodiment can enhance the attraction effect of the second option by controlling the auditory expression even if the category of the plurality of options is, for example, a meal.
  • For example, when guiding the user to select a wine that is the second option, the output control unit 230 may use music or the like as the background sound to remind the user of France, the place associated with the wine.
  • Although the background sound has been described above as an example of auditory expression, the output control unit 230 can also enhance the attraction effect of the second option by controlling the speed, strength, pitch, length, and the like of the voice utterance of the explanatory text for the first to third options.
  • FIG. 8 is a flowchart showing the flow of output control by the information processing server 20 according to the present embodiment.
  • First, the recognition unit 210 of the information processing server 20 recognizes the user and the action of the user based on the sound information, image information, and sensor information collected by the information processing terminal 10 (S1101).
  • the determination unit 220 determines whether the action of the user recognized in step S1101 satisfies the presentation condition of multiple options (S1102). At this time, the determination unit 220 determines whether the recognized action is an action affecting target achievement, and whether the category value of the action exceeds a threshold.
  • If the action of the user does not satisfy the presentation condition of the plurality of options (S1102: No), the information processing server 20 ends the process related to the output control of the plurality of options. In this case, if the user's action recognized in step S1101 is a voice input action such as an inquiry, the information processing server 20 may perform processing corresponding to the voice utterance.
  • If the action of the user satisfies the presentation condition of the plurality of options (S1102: Yes), the output control unit 230 acquires the second option and the third option based on the category value of the first option estimated from the user's action (S1103).
  • At this time, the output control unit 230 acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • the output control unit 230 sets an output mode relating to the first to third options so as to improve the attraction effect of the second option to the user (S1104). Specifically, the output control unit 230 can enhance the attraction effect of the second option by controlling the linguistic expression, the visual expression, and the auditory expression according to the first to third options.
  • the output control unit 230 causes the speech synthesis unit 240 to synthesize an artificial speech used for speech utterances related to the first to third options (S1105).
  • the process in step S1105 may not be performed.
  • Subsequently, the terminal communication unit 260 transmits, to the information processing terminal 10, the artificial voice synthesized in step S1105 and a control signal corresponding to the output mode set in step S1104, and output control of the first to third options is executed (S1106).
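  • The overall flow of FIG. 8 can be summarized by the following compact sketch; every helper is an invented stand-in for the corresponding unit, and only the control flow mirrors steps S1101 to S1106 described above:

        def recognize(transcript):                       # S1101: recognition (stub)
            return {"category": "meal", "value": 620, "name": "beef"} if "beef" in transcript else None

        def presentation_condition(action):              # S1102: the threshold is an assumed value
            return action is not None and action["category"] == "meal" and action["value"] > 600

        def acquire_options(action):                     # S1103: second option bracketed by the others
            return ({"name": "chicken", "value": 450}, {"name": "fish", "value": 300})

        def set_output_mode(first, second, third):       # S1104: emphasise the second option
            return {"order": [first["name"], second["name"], third["name"]],
                    "highlight": second["name"]}

        def run(transcript):
            action = recognize(transcript)
            if not presentation_condition(action):
                return None                              # end of output control (S1102: No)
            second, third = acquire_options(action)
            mode = set_output_mode(action, second, third)
            # S1105 (speech synthesis) and S1106 (transmission to the terminal) are omitted here
            return mode

        print(run("I'm going to have beef for dinner"))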
  • FIG. 9 is a block diagram illustrating an exemplary hardware configuration of the information processing terminal 10 and the information processing server 20 according to an embodiment of the present disclosure.
  • The information processing terminal 10 and the information processing server 20 include, for example, a CPU 871, a ROM 872, a RAM 873, a host bus 874, a bridge 875, an external bus 876, an interface 877, an input device 878, an output device 879, a storage 880, a drive 881, a connection port 882, and a communication device 883.
  • the hardware configuration shown here is an example, and some of the components may be omitted. In addition, components other than the components shown here may be further included.
  • the CPU 871 functions as, for example, an arithmetic processing unit or a control unit, and controls the overall operation or a part of each component based on various programs recorded in the ROM 872, the RAM 873, the storage 880, or the removable recording medium 901.
  • the ROM 872 is a means for storing a program read by the CPU 871, data used for an operation, and the like.
  • the RAM 873 temporarily or permanently stores, for example, a program read by the CPU 871 and various parameters appropriately changed when the program is executed.
  • the CPU 871, the ROM 872, and the RAM 873 are mutually connected via, for example, a host bus 874 capable of high-speed data transmission.
  • host bus 874 is connected to external bus 876, which has a relatively low data transmission speed, via bridge 875, for example.
  • the external bus 876 is connected to various components via an interface 877.
  • For the input device 878, for example, a mouse, a keyboard, a touch panel, a button, a switch, a lever, and the like are used. Furthermore, a remote controller (hereinafter, remote control) capable of transmitting control signals using infrared rays or other radio waves may be used as the input device 878.
  • the input device 878 also includes a voice input device such as a microphone.
  • The output device 879 is a device capable of visually or aurally notifying the user of acquired information, such as a display device (e.g., a CRT (Cathode Ray Tube) display, an LCD, or an organic EL display), an audio output device (e.g., a speaker or headphones), a printer, a mobile phone, or a facsimile. The output device 879 according to the present disclosure also includes various vibration devices capable of outputting haptic stimulation.
  • the storage 880 is a device for storing various data.
  • a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like is used.
  • the drive 881 is a device that reads information recorded on a removable recording medium 901 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, or writes information on the removable recording medium 901, for example.
  • the removable recording medium 901 is, for example, DVD media, Blu-ray (registered trademark) media, HD DVD media, various semiconductor storage media, and the like.
  • the removable recording medium 901 may be, for example, an IC card equipped with a non-contact IC chip, an electronic device, or the like.
  • The connection port 882 is, for example, a port for connecting an externally connected device 902, such as a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, or an optical audio terminal.
  • the external connection device 902 is, for example, a printer, a portable music player, a digital camera, a digital video camera, an IC recorder, or the like.
  • the communication device 883 is a communication device for connecting to a network.
  • The communication device 883 is, for example, a communication card for wired or wireless LAN, Bluetooth (registered trademark), or WUSB (Wireless USB), a router for optical communication, an ADSL (Asymmetric Digital Subscriber Line) router, or a modem for various types of communication.
  • As described above, the information processing server 20 according to the present embodiment acquires the second option and the third option based on the category value of the first option estimated from the action of the user, and controls the output of the first to third options.
  • One feature of the information processing server 20 according to the present embodiment is that it acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option. With such a configuration, it is possible to more naturally guide the user to an appropriate selection.
  • Although the above description has mainly dealt with a case where the information processing server 20 causes the display unit 110 to output the first to third options as visual information, the present technology is not limited to such an example. For example, the information processing server 20 can also present the first to third options to the user using only voice utterances.
  • Similarly, although the above description has mainly dealt with a case where the information processing server 20 outputs one second option and one third option, the information processing server 20 can also perform output control by acquiring a plurality of second options and a plurality of third options.
  • The steps related to the processing of the information processing server 20 in this specification do not necessarily have to be processed chronologically in the order described in the flowchart.
  • the steps related to the processing of the information processing server 20 may be processed in an order different from the order described in the flowchart or may be processed in parallel.
  • (1) An information processing apparatus including: an output control unit that acquires a second option and a third option based on the category value of a first option estimated from a user's behavior, and controls output of the first option, the second option, and the third option, wherein the output control unit acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • (2) The output control unit outputs the plurality of options in the order of the first option, the second option, and the third option.
  • (3) The output control unit causes the second option to be displayed near the center of the display area.
  • (4) The output control unit acquires the second option and the third option belonging to the same category as that of the first option, based on the category of the first option.
  • The information processing apparatus according to any one of the above (1) to (3).
  • (5) The output control unit acquires the second option and the third option based on the category value of the first option satisfying a predetermined condition.
  • The information processing apparatus according to any one of the above (1) to (4).
  • (6) The output control unit controls an output mode of the first option, the second option, and the third option such that the attraction effect of the second option on the user is improved.
  • The information processing apparatus according to any one of the above (1) to (5).
  • (7) The output mode includes a visual expression,
  • and the output control unit controls the visual expression of the first option, the second option, and the third option such that the attraction effect of the second option is improved.
  • (8) The visual expression includes a display effect of images representing the first option, the second option, and the third option,
  • and the output control unit controls the display effect of the images representing the first option, the second option, and the third option such that the attraction effect of the second option is improved.
  • (9) The display effect of the images includes at least one of brightness, contrast, saturation, resolution, and noise.
  • (10) The visual expression includes a color expression,
  • and the output control unit controls the color expression of the first option, the second option, and the third option such that the attraction effect of the second option is improved.
  • The information processing apparatus according to any one of the above (7) to (9).
  • (11) The output control unit acquires images representing the first option, the second option, and the third option such that the attraction effect of the second option is improved, and controls the output of the images.
  • The information processing apparatus according to any one of the above (6) to (10).
  • (12) The output mode includes an auditory expression, and the output control unit controls the auditory expression of the first option, the second option, and the third option such that the attraction effect of the second option is improved.
  • The information processing apparatus according to any one of the above (6) to (11).
  • (13) The auditory expression includes a background sound, and the output control unit controls the output of the background sound such that the attraction effect of the second option is improved.
  • (14) The output mode includes a linguistic expression, and the output control unit controls the linguistic expression of the first option, the second option, and the third option such that the attraction effect of the second option is improved.
  • (15) The linguistic expression includes modifiers for the first option, the second option, and the third option, and the output control unit controls the modifiers for the first option, the second option, and the third option such that the attraction effect of the second option is improved.
  • (16) The information processing apparatus according to any one of the above (1) to (15), further including a recognition unit that recognizes the action of the user.
  • (17) The information processing apparatus according to any one of the above (1) to (16), further including a display unit that displays the first option, the second option, and the third option based on control by the output control unit.
  • (18) The information processing apparatus according to any one of the above (1) to (17), further including an audio output unit that outputs the first option, the second option, and the third option by voice based on control by the output control unit.
  • (19) An information processing method including: acquiring, by a processor, a second option and a third option based on the category value of a first option estimated from a user's behavior; and controlling output of the first option, the second option, and the third option, wherein the controlling further includes acquiring the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.
  • (20) A program for causing a computer to function as an information processing apparatus including: an output control unit that acquires a second option and a third option based on the category value of a first option estimated from a user's behavior, and controls output of the first option, the second option, and the third option, wherein the output control unit acquires the second option and the third option such that the category value of the second option is located between the category value of the first option and the category value of the third option.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an information processing device that comprises an output control unit which: acquires a second option and a third option on the basis of a category value for a first option estimated from a user's behavior; and controls the output of the first option, the second option, and the third option. The output control unit acquires the second option and the third option such that the category value for the second option lies between the category value for the first option and the category value for the third option. Said information processing device guides a user more naturally toward an appropriate choice.
PCT/JP2018/019663 2017-08-01 2018-05-22 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2019026396A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-148814 2017-08-01
JP2017148814 2017-08-01

Publications (1)

Publication Number Publication Date
WO2019026396A1 true WO2019026396A1 (fr) 2019-02-07

Family

ID=65233710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/019663 WO2019026396A1 (fr) 2017-08-01 2018-05-22 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (1)

Country Link
WO (1) WO2019026396A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020140670A (ja) * 2019-03-01 2020-09-03 株式会社トゥエンティーフォーセブン 食事のリコメンド方法、リコメンドシステムおよびリコメンドプログラム

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015045629A1 (fr) * 2013-09-25 2015-04-02 日産自動車株式会社 Dispositif d'affichage d'informations de véhicule

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015045629A1 (fr) * 2013-09-25 2015-04-02 日産自動車株式会社 Dispositif d'affichage d'informations de véhicule

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
3 March 2014 (2014-03-03), pages 1 - 8 *
JANNACH, DIETMAR: "Recommender systems : an introduction, first edition", KYORITSU SHUPPAN CO., LTD, 10 December 2012 (2012-12-10), pages 243 - 262 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020140670A (ja) * 2019-03-01 2020-09-03 株式会社トゥエンティーフォーセブン 食事のリコメンド方法、リコメンドシステムおよびリコメンドプログラム

Similar Documents

Publication Publication Date Title
US20210034141A1 (en) Information processing system, client terminal, information processing method, and recording medium
JP7100092B2 (ja) ワードフロー注釈
JP2019000937A (ja) コミュニケーション装置、コミュニケーションロボットおよびコミュニケーション制御プログラム
CN113532464A (zh) 控制方法、个人认证装置和记录介质
JPWO2015178078A1 (ja) 情報処理装置、情報処理方法及びプログラム
WO2017130486A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP2019008570A (ja) 情報処理装置、情報処理方法及びプログラム
JPWO2016181670A1 (ja) 情報処理装置、情報処理方法及びプログラム
WO2023018908A1 (fr) Système d'intelligence artificielle conversationnelle dans un espace de réalité virtuelle
WO2019026396A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
EP3340240B1 (fr) Dispositif et procédé de traitement d'informations ainsi que programme
WO2016157678A1 (fr) Dispositif, procédé et programme de traitement d'informations
WO2019073668A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2016052501A1 (fr) Dispositif d'interface d'utilisateur, programme et procédé de notification de contenu
WO2018198447A1 (fr) Dispositif et procédé de traitement d'informations
WO2023221233A1 (fr) Appareil, système et procédé de projection en miroir interactive
WO2018168247A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2014030657A (ja) 刺激誘発装置、刺激誘発方法及びプログラム
WO2018061346A1 (fr) Dispositif de traitement d'informations
JP2021114004A (ja) 情報処理装置及び情報処理方法
JP5330005B2 (ja) デジタルフォトフレーム、情報処理システム及び制御方法
US11593426B2 (en) Information processing apparatus and information processing method
WO2020158171A1 (fr) Processeur d'informations pour sélection d'agent de réponse
US11270682B2 (en) Information processing device and information processing method for presentation of word-of-mouth information
US20240071014A1 (en) Predicting context aware policies based on shared or similar interactions

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18842011

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18842011

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP