CN110462597B - Information processing system and storage medium

Publication number: CN110462597B (earlier publication: CN110462597A)
Application number: CN201880022385.8A
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 小森显博
Assignee: Sony Corp
Legal status: Active

Classifications

    • G06F 13/00: Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer


Abstract

The invention provides an information processing system and a storage medium capable of controlling the execution of notification to the outside according to the emotion of a user. The system includes: a notification unit that performs notification to the outside; and a control unit that controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, execution of the notification is temporarily suspended after a user operation instructing its execution, and the notification to the outside is executed in response to a further user operation instructing its execution.

Description

Information processing system and storage medium
Technical Field
The present disclosure relates to information processing systems and storage media.
Background
Conventionally, information processing devices such as smartphones and PCs (personal computers) have often been used to exchange messages via the Internet. With text messages alone it is difficult to grasp the partner's mood and intent, since the partner's face cannot be seen and the voice cannot be heard; in recent years, however, it has become possible to insert images (still images and moving images) into text messages or to send an image alone as a message, making it easier to convey one's mood to the partner.
Regarding techniques for conveying emotion, for example, Patent Document 1 below discloses a content generating device that estimates emotion from a human voice and sets the behavior of a CG character (facial expression and its degree, face angle, nodding, and head shaking) according to preset conditions.
In addition, Patent Document 2 below describes an information processing apparatus that selects video as non-voice information for each phoneme based on feature amounts of input voice information; emotion is estimated from the difference in feature amounts between consecutive utterances, and as that difference becomes larger, the degree to which the mouth opens and the head tilts in the video is changed.
Patent Document 3 below discloses an entertainment apparatus in which a character responds in real time to voice input from a performer: changes in the pitch and volume of the input voice are tracked for each word, each word is evaluated based on the difference from reference data, and the content of the character's actions is determined based on that evaluation.
Patent Document 1: Japanese Patent Laid-Open No. 2008-217747
Patent Document 2: International Publication No. 2010/047027
Patent Document 3: Japanese Patent Laid-Open No. 2002-136764
However, the above techniques aim to reflect the emotion of the speaker more accurately, and do not consider what happens when a negative emotion is conveyed directly to the other party.
When a message is notified in a state where the other party's face cannot be seen, doing so while holding a negative emotion may develop into interpersonal trouble or into regret for the user himself or herself.
Disclosure of Invention
In view of the above, the present disclosure proposes an information processing system and a storage medium capable of controlling the execution of notification to the outside according to the emotion of the user.
According to the present disclosure, there is provided an information processing system including: a notification unit that performs notification to the outside; and a control unit that controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, execution of the notification is temporarily suspended after a user operation instructing its execution, and the notification to the outside is executed in response to a further user operation instructing its execution.
According to the present disclosure, there is provided a storage medium having a program recorded thereon, the program causing a computer to function as: a notification unit that performs notification to the outside; and a control unit that controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, execution of the notification is temporarily suspended after a user operation instructing its execution, and the notification to the outside is executed in response to a further user operation instructing its execution.
As described above, according to the present disclosure, the execution of notification to the outside can be controlled according to the emotion of the user.
Further, the above-described effect is not necessarily limiting. Together with or in place of the above effect, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
Drawings
Fig. 1 is a diagram illustrating an outline of an information processing system according to an embodiment of the present disclosure.
Fig. 2 is a block diagram showing an example of the structure of a client terminal according to the present embodiment.
Fig. 3 is a diagram showing an example of a conversion table for words expressing "anger" according to the present embodiment.
Fig. 4 is a diagram showing an example of a conversion table for pictograms expressing "anger" according to the present embodiment.
Fig. 5 is a diagram showing an example of the overall configuration of the information processing system according to the present embodiment.
Fig. 6 is a block diagram showing an example of the structure of a server according to the present embodiment.
Fig. 7 is a flowchart showing an operation process according to the first embodiment.
Fig. 8 is a diagram showing an example of a conversion candidate presentation screen according to the first embodiment.
Fig. 9 is a block diagram showing an example of the structure of a client terminal according to the second embodiment.
Fig. 10 is a flowchart showing an operation process according to the second embodiment.
Fig. 11 is a flowchart showing an operation process of the speaker control according to the second embodiment.
Fig. 12 is a diagram illustrating the main configuration of client terminals included in the information processing system according to the third embodiment.
Fig. 13 is a flowchart showing an operation process according to the third embodiment.
Fig. 14 is a diagram showing an example of a presentation screen according to the third embodiment.
Fig. 15 is a flowchart showing an operation process according to the fourth embodiment.
Detailed Description
Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. In the present specification and drawings, constituent elements having substantially the same functional constitution are denoted by the same reference numerals, and overlapping description thereof is omitted.
The description will proceed in the following order.
1. Summary of an information processing system according to one embodiment of the present disclosure
2. Structure
2-1. Structure of client terminal 1
2-2. Other structural examples
3. Various embodiments
3-1. First embodiment
3-2. Second embodiment
3-3. Third embodiment
3-4. Fourth embodiment
4. Summary
< 1. Summary of an information processing system according to one embodiment of the present disclosure >
Fig. 1 is a diagram illustrating an outline of an information processing system according to an embodiment of the present disclosure. As shown in fig. 1, the information processing system according to the present embodiment is capable of recognizing the emotion of a user when the user performs notification to the outside, and controlling the execution of the notification according to the recognized emotion of the user.
That is, the information processing system according to the present embodiment proposes, for example, that when a user holding a negative emotion is about to transmit from the client terminal 1 a message that directly expresses that negative emotion, execution of the notification be temporarily suspended, so that it can be confirmed whether the notification should really be transmitted, or the message can be shifted to a milder expression.
Here, in this specification, "negative emotion" means an emotion opposite to a positive emotion; of the six basic emotions (happiness, sadness, anger, disgust, surprise, fear), mainly "anger" and "disgust" correspond to it. The emotion types are not limited to six; in a more detailed classification, for example, "aversion", "anger", "aggression", "defiance", "bitterness", "irritation", and "indignation" correspond to negative emotions.
In the example shown in fig. 1, when the user, while angry, is about to transmit an aggressive message, a confirmation message such as "Is this expression all right?" is presented together with conversion candidates that turn the message input by the user into a milder expression. The emotion of the user is recognized, for example, by capturing a face image of the user with the camera 13 provided on the client terminal 1 and analyzing the facial expression from the face image.
This can prevent the user from sending a negative message to the outside in a fit of emotion, and as a result smooth communication can be expected.
In the above, an information processing system according to an embodiment of the present disclosure is described. Next, a specific configuration of an information processing system according to the present embodiment will be described with reference to the drawings.
< 2. Structure >
< 2-1. Structure of client terminal 1 >
First, as an example, a structure of a client terminal 1 that executes the information processing system according to the present embodiment will be described with reference to fig. 2. The client terminal 1 may be implemented by a smart phone, a mobile phone terminal, a tablet terminal, a PC (personal computer), a wearable device, a game device, or the like.
Fig. 2 is a block diagram showing an example of the structure of the client terminal 1 according to the present embodiment. As shown in fig. 2, the client terminal 1 includes a control unit 10, a communication unit 11, an operation input unit 12, a camera 13, a sensor 14, a sound input unit 15, a display unit 16, a sound output unit 17, and a storage unit 18.
The control unit 10 functions as an arithmetic processing device and a control device, and controls the overall operation in the client terminal 1 according to various programs. The control unit 10 is implemented by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. The control unit 10 may include a ROM (Read Only Memory) that stores programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
The control unit 10 according to the present embodiment also functions as an emotion recognition unit 101, a presentation conversion unit 102, a display control unit 103, and a message transmission execution control unit 104.
The emotion recognition section 101 recognizes the emotion of the user based on the face image of the user captured by the camera 13, the sensor information sensed by the sensor 14, and the voice input through the sound input section 15. More specifically, the emotion recognition section 101 may recognize the emotion of the user based on, for example, analysis of facial expression, voice intonation, timbre, conversation content (including sighs, muttering, and the like), biological information (heart rate information, pulse information, perspiration information, body temperature information, brain wave information, myoelectric information, and the like), and motion information (acceleration sensor data, gyro sensor data, geomagnetic sensor data, and the like). The emotion recognition section 101 may also recognize the emotion of the user in consideration of the context of the messages exchanged between the user and the partner immediately before. The algorithm for emotion recognition is not particularly limited; for example, emotion may be estimated from facial expression (smiling face, normal face, angry face), from feature amounts of the voice, or from feature amounts and patterns of the biological information and motion information. The emotion recognition algorithm may be stored in the storage unit 18 in advance, or may be generated by machine learning.
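The patent leaves the recognition algorithm open; one common realization is late fusion of per-modality scores. The following minimal Python sketch illustrates the idea under that assumption; every function name, weight, and score here is a hypothetical placeholder, not the disclosed method.

```python
# Hedged sketch: late fusion of per-modality emotion scores.
# All function names, weights, and labels are illustrative assumptions.
from typing import Dict

EMOTIONS = ["happiness", "sadness", "anger", "disgust", "surprise", "fear"]

def score_expression(face_image) -> Dict[str, float]:
    # Placeholder for a facial-expression classifier (smiling/normal/angry).
    return {e: 0.0 for e in EMOTIONS} | {"anger": 0.8}

def score_voice(audio) -> Dict[str, float]:
    # Placeholder for intonation/timbre features mapped to emotion scores.
    return {e: 0.0 for e in EMOTIONS} | {"anger": 0.6}

def score_biosignals(sensors) -> Dict[str, float]:
    # Placeholder for heart rate, perspiration, motion data, etc.
    return {e: 0.0 for e in EMOTIONS} | {"anger": 0.7, "fear": 0.2}

def recognize_emotion(face_image=None, audio=None, sensors=None) -> str:
    """Combine whatever modalities are available; return the top label."""
    weights = {"face": 0.5, "voice": 0.3, "bio": 0.2}  # assumed weights
    total = {e: 0.0 for e in EMOTIONS}
    for name, scorer, data in [("face", score_expression, face_image),
                               ("voice", score_voice, audio),
                               ("bio", score_biosignals, sensors)]:
        if data is not None:
            for e, s in scorer(data).items():
                total[e] += weights[name] * s
    return max(total, key=total.get)

print(recognize_emotion(face_image="frame", sensors="readings"))  # -> "anger"
```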
The expression conversion unit 102 converts the expression of a message to the outside that the user has input or instructed to transmit while the user is in a specific emotion. For example, the expression conversion unit 102 converts the expression by referring to a conversion table stored in advance in the storage unit 18. Here, fig. 3 shows an example of such a conversion table. Fig. 3 is a diagram showing an example of a conversion table for words expressing "anger". In the conversion table shown in fig. 3, for each input word expressing anger, a word rephrased to sound slightly milder is registered as the converted word. In the example shown in fig. 3 there is only one converted word per entry, but the present embodiment is not limited to this, and a plurality of words may be registered in advance as converted words. The converted words shown in fig. 3 are an example, and the present embodiment is not limited to them.
Expression conversion is not limited to text, and may also be performed on images expressing emotion (so-called pictograms). Here, fig. 4 shows an example of a conversion table for pictograms. Fig. 4 is a diagram showing an example of a conversion table for pictograms expressing "anger". As shown in fig. 4, for each input pictogram, a slightly milder pictogram is registered as the converted pictogram.
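The conversion tables of figs. 3 and 4 behave like simple key-value maps from a harsh expression to a milder one. A minimal sketch follows; the concrete word and pictogram pairs are invented placeholders, since the actual registered entries appear only in the figures.

```python
# Hedged sketch of the expression conversion tables (figs. 3 and 4).
# The concrete entries are placeholders; real tables would be pre-registered
# in the storage unit 18 and could hold several candidates per entry.
ANGRY_WORD_TABLE = {
    "idiot": "that was careless",   # assumed example pair
    "shut up": "please stop",       # assumed example pair
}
ANGRY_PICTOGRAM_TABLE = {
    "\U0001F621": "\U0001F615",     # pouting face -> slightly confused face
}

def soften(message: str) -> str:
    """Replace registered angry words/pictograms with milder ones."""
    for table in (ANGRY_WORD_TABLE, ANGRY_PICTOGRAM_TABLE):
        for harsh, mild in table.items():
            message = message.replace(harsh, mild)
    return message

print(soften("shut up \U0001F621"))  # -> "please stop \U0001F615"
```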
The display control unit 103 controls the display on the display unit 16. The display control unit 103 according to the present embodiment presents to the user, as a conversion candidate, the expression converted by the expression conversion unit 102, or asks the user whether a message to the outside that was input or instructed to be transmitted while the user held a specific emotion (for example, "anger") should really be sent.
The message transmission execution control unit 104 performs control so as to transmit the input message to the outside through the communication unit 11 based on a user instruction. In addition, the message transmission execution control unit 104 according to the present embodiment performs the following control: when the user inputs a message or instructs its transmission while holding a specific emotion (for example, "anger"), transmission is temporarily suspended, and the message is transmitted when there is a further transmission execution instruction from the user.
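The suspend-then-confirm behavior of the message transmission execution control unit 104 reduces to a small two-step state machine. Below is a minimal sketch assuming a generic `send` callback and a pre-computed emotion label; both names are illustrative assumptions, not the disclosed implementation.

```python
# Hedged sketch of the two-step send control: suspend on negative emotion,
# transmit only after a further explicit instruction from the user.
class MessageSendController:
    def __init__(self, send):          # `send` is an assumed transport callback
        self.send = send
        self.pending = None            # message held while suspended

    def request_send(self, message: str, emotion: str) -> str:
        if emotion == "anger":         # negative emotion: suspend temporarily
            self.pending = message
            return "suspended: awaiting user confirmation"
        self.send(message)
        return "sent"

    def confirm_send(self, use_converted: bool, converted: str = "") -> str:
        if self.pending is None:
            return "nothing pending"
        self.send(converted if use_converted and converted else self.pending)
        self.pending = None
        return "sent"

ctrl = MessageSendController(send=print)
ctrl.request_send("shut up", emotion="anger")    # suspended, nothing sent yet
ctrl.confirm_send(use_converted=True, converted="please stop")
```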
The communication unit 11 transmits and receives data to and from external devices (for example, peripheral devices, routers, base stations, servers, and the like) by wire or wirelessly. The communication unit 11 transmits and receives external data via, for example, a wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), or a mobile communication network (LTE (Long Term Evolution), 3G (third-generation mobile communication system)).
The operation input unit 12 receives an operation instruction from the user and outputs the operation content to the control unit 10. The operation input unit 12 may be a touch sensor, a pressure sensor, or a non-contact sensor provided integrally with the display unit 16. Alternatively, the operation input unit 12 may be physically separate from the display unit 16, such as a button, a switch, or a lever.
The camera 13 includes a lens system composed of an imaging lens, an aperture, a zoom lens, a focus lens, and the like; a drive system that causes the lens system to perform focusing and zooming operations; and a solid-state imaging element array that photoelectrically converts the imaging light obtained by the lens system to generate an imaging signal. The solid-state imaging element array may be realized by, for example, a CCD (Charge Coupled Device) sensor array or a CMOS (Complementary Metal Oxide Semiconductor) sensor array. The camera 13 according to the present embodiment is provided at a position (for example, near the display unit 16) where it can capture the face of the user operating the client terminal 1.
The sensor 14 is a sensing unit that senses the user's condition and outputs the sensed information to the control unit 10. The sensor 14 may be a cluster of sensors or a plurality of types of sensors. Examples of the sensor 14 include motion sensors (an acceleration sensor, a gyro sensor, a geomagnetic sensor, and the like), position sensors (indoor positioning based on communication with Wi-Fi (registered trademark) or Bluetooth (registered trademark), or outdoor positioning using GPS or the like), biological sensors (a heart rate sensor, a pulse sensor, a perspiration sensor, a body temperature sensor, a brain wave sensor, a myoelectric sensor, and the like), and environmental sensors (a temperature sensor, a humidity sensor, an illuminance sensor, a rain sensor, and the like).
The audio input unit 15 is realized by a microphone, a microphone amplifier unit that amplifies the audio signal obtained by the microphone, and an A/D converter that digitizes the audio signal, and outputs the audio signal to the control unit 10.
The display unit 16 is a display device that outputs various operation input screens. The display unit 16 may be, for example, a liquid crystal display (LCD: liquid Crystal Display), an organic EL (Electroluminescence) display, or the like.
The audio output unit 17 includes a speaker for reproducing an audio signal and an amplifying circuit for the speaker.
The storage unit 18 is implemented by a ROM (Read Only Memory) that stores programs and calculation parameters used in the processing of the control unit 10, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate. The storage unit 18 according to the present embodiment also stores the emotion recognition algorithm, the expression conversion tables, and the like.
The structure of the client terminal 1 according to the present embodiment has been described above in detail. The configuration of the client terminal 1 is not limited to the example shown in fig. 2; for example, the camera 13 or the sensor 14 may be provided in an external device (a peripheral device such as a wearable device), and the captured image or the sensor data may be transmitted to the client terminal 1 wirelessly or by wire.
< 2-2. Other structural examples >
In the example described above, the case where the information processing system according to the present embodiment is executed by the client terminal 1 has been described, but the present disclosure is not limited to this; the main processing may be performed on the server 2 side in a system configuration including the client terminal 1 and the server 2. This is described below with reference to figs. 5 and 6.
Fig. 5 is a diagram showing an example of the overall configuration of the information processing system according to the present embodiment. As shown in fig. 5, the information processing system according to the present embodiment includes a client terminal 1 and a server 2.
The client terminals 1 (1A, 1B) and the server 2 are connected to each other via a network 3 so as to transmit and receive data.
The server 2 can perform recognition of the emotion of the user, conversion of the inputted expression, and transmission execution control of the message based on the information transmitted from the client terminal 1. The specific configuration of the server 2 will be described with reference to fig. 6.
For example, when user A of the client terminal 1A and user B of the client terminal 1B exchange messages, the server 2 recognizes the emotion of user A or user B, and controls the execution of transmission of a message to the outside that a user has input or instructed to transmit while holding a specific emotion (for example, "anger").
(Structure of Server 2)
Next, the structure of the server 2 will be described with reference to fig. 6. Fig. 6 is a block diagram showing an example of the structure of a server according to the present embodiment. As shown in fig. 6, the server 2 includes a control unit 20, a communication unit 21, and a storage unit 22.
(control section 20)
The control unit 20 functions as an arithmetic processing device and a control device, and controls the overall operation in the server 2 according to various programs. The control unit 20 is implemented by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. The control unit 20 may include a ROM (Read Only Memory) that stores programs to be used and calculation parameters, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
The control unit 20 according to the present embodiment also functions as an emotion recognition unit 201, a performance conversion unit 202, an output control unit 203, and a message transmission execution control unit 204.
The emotion recognition section 201, like the emotion recognition section 101 described above, recognizes the emotion of the user based on at least one of the captured face image, sensor information (heart rate information, pulse information, perspiration information, body temperature information, brain wave information, myoelectric information, acceleration sensor data, gyro sensor data, geomagnetic sensor data, and the like), and voice information of the user transmitted from the client terminal 1.
The expression conversion unit 202, like the expression conversion unit 102 described above, converts the expression of a message to the outside that the user has input or instructed to transmit while in a specific emotion.
The output control unit 203 generates (outputs) display information to be displayed on the display unit 16 of the client terminal 1, and transmits the display information from the communication unit 21 to the client terminal 1. The output control unit 203 according to the present embodiment generates, for example, a screen presenting to the user the expression converted by the expression conversion unit 202 as a conversion candidate, or a screen asking the user whether a message to the outside that was input or instructed to be transmitted while the user held a specific emotion (for example, "anger") should really be sent, and transmits it to the client terminal 1.
The message transmission execution control unit 204 controls execution of transmission of a message input at the client terminal 1 to its destination. The message transmission execution control unit 204 according to the present embodiment performs control such that, for example, when user A of the client terminal 1A inputs a message or instructs its transmission while holding a specific emotion (for example, "anger"), transmission of the message to the client terminal 1B as the destination is temporarily suspended, and the message is transmitted when there is a further transmission execution instruction from user A.
(communication section 21)
The communication unit 21 is connected to an external device by wire or wireless, and transmits and receives data. The communication unit 21 is connected to an external device by wired/wireless LAN (Local Area Network), wi-Fi (Wireless Fidelity, registered trademark), or the like, for example.
(storage section 22)
The storage unit 22 is implemented by a ROM that stores programs, calculation parameters, and the like used in the processing of the control unit 20, and a RAM that temporarily stores parameters that change as appropriate. For example, the storage unit 22 according to the present embodiment stores the emotion recognition algorithm used by the emotion recognition unit 201, the expression conversion table used by the expression conversion unit 202, and the like.
The structure of the server 2 according to the present embodiment has been described above in detail. The configurations shown in figs. 5 and 6 are examples, and the present embodiment is not limited to them. For example, at least a part of the server 2 may be provided in an external device, and at least some of the functions of the control unit 20 may be realized by the client terminal 1 or by a communication device located relatively close to the client terminal 1 (a so-called edge server or the like). In addition, part of the recognition processing of the emotion recognition section 201 of the server 2 may be performed on the client terminal 1 side, with the expression analysis result (feature amounts of the facial expression), voice feature amounts, and the feature amounts and patterns of the various sensor data transmitted to the server 2. By appropriately distributing the configuration and functions of the server 2 in this way, improvement of real-time performance, reduction of processing load, and security can be ensured.
< 3. Embodiments >
Next, embodiments of the present disclosure will be specifically described with reference to the drawings.
< 3-1. First embodiment >
First, a first embodiment will be described with reference to figs. 7 and 8. In the present embodiment, as in fig. 1, a case is described in which a user holding a negative emotion is about to transmit a negatively expressed message to the partner user in a message exchange using the client terminal 1.
Fig. 7 is a flowchart showing the operation processing according to the first embodiment. As shown in fig. 7, first, when the user inputs a character string (message) at the client terminal 1 (step S103) and performs a transmission operation (step S106), the emotion recognition section 101 recognizes the emotion of the user (step S109). Emotion recognition may be performed continuously or intermittently while the user operates the client terminal 1, and the emotion recognition section 101 acquires the recognition result of the user's emotion at the time the user performs the transmission operation (at least immediately before the message transmission (notification)).
Next, when the emotion of "anger" is detected as the emotion of the user (step S112/Yes), the expression conversion section 102 identifies words or pictograms expressing that emotion in the character string the user is about to transmit (step S115). Words or pictograms expressing emotion can be identified using, for example, dictionary data stored in the storage unit 18.
Next, when a word or pictogram expressing the emotion of "anger" is detected in the character string (step S118/Yes), the expression conversion unit 102 refers to the conversion table and converts the word or pictogram expressing anger into a milder expression (step S121). At this time, the message transmission execution control unit 104 temporarily suspends execution of the message transmission.
Next, the display control unit 103 presents the conversion candidate converted by the expression conversion unit 102 to the user (step S124). Here, fig. 8 shows an example of a conversion candidate presentation screen. As shown on the left side of fig. 8, when using a messenger for exchanging messages, for example, the messages being exchanged are displayed in time series on the display unit 16, and the user can send a message by inputting a character string into the message input field 161 and touching the send button. When the send button is touched (that is, when the user performs a transmission operation), the emotion recognition section 101 of the client terminal 1 recognizes the emotion of the user. When the user holds a negative emotion (here, the emotion of "anger") and the input character string contains a word expressing "anger", a conversion candidate presentation image 162 for converting the word into a milder expression is displayed, as shown on the right side of fig. 8. The conversion candidate presentation image 162 displays a prompt to the effect that an inappropriate expression was detected in the message and a correction candidate is presented, asks whether to modify the expression to the milder one and send, and shows "Yes/No" buttons. The user can thereby reconsider an angry expression about to be sent in a momentary fit of anger, and confirm one last time whether to actually send it. The user touches the "Yes" button to change the expression as proposed and send, and touches the "No" button to send the message as it is without changing the expression. The display screen for conversion candidates is not limited to the example shown in fig. 8; for example, a screen displaying a plurality of conversion candidates may be used, in which case the user selects one conversion candidate from among them.
Next, when the user selects the conversion candidate (in the example shown in fig. 8, when the "Yes" button is touched) (step S127/Yes), the message transmission execution control unit 104 transmits the converted character string to the partner (step S130).
On the other hand, when the user does not select the conversion candidate (in the example shown in fig. 8, when the "No" button is touched) (step S127/No), the message transmission execution control unit 104 transmits the input character string to the partner as it is (step S133). Likewise, when the emotion of "anger" is not detected as the emotion of the user (step S112/No), or when the emotion of "anger" is detected but no word or pictogram expressing it is found in the character string (step S118/No), the message transmission execution control unit 104 transmits the input character string to the partner as it is (step S133).
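Taken together, the fig. 7 flow (steps S109 to S133) can be condensed as in the following sketch; the toy dictionary, the hard-coded replacement, and the boolean standing in for the yes/no dialog are all assumptions made for illustration, not the disclosed code.

```python
# Hedged end-to-end sketch of the fig. 7 flow; all helper names are
# illustrative stand-ins for the units 101-104.
def contains_angry_expression(message: str) -> bool:
    return any(w in message for w in ("idiot", "shut up"))   # toy dictionary

def on_send_pressed(message: str, emotion: str,
                    user_accepts_candidate: bool, send=print) -> None:
    # S112: is the user angry at the moment of the send operation?
    # S115-S118: does the string contain a word expressing that anger?
    if emotion == "anger" and contains_angry_expression(message):
        candidate = message.replace("shut up", "please stop")   # S121
        # S124-S127: the yes/no dialog of fig. 8 is reduced to a flag here.
        send(candidate if user_accepts_candidate else message)  # S130 / S133
    else:
        send(message)                                           # S133

on_send_pressed("shut up", "anger", user_accepts_candidate=True)
```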
The operation processing according to the present embodiment has been described above. A "correct" button may further be provided so that the user can correct the message before transmission by himself or herself without selecting a conversion candidate.
Alternatively, the expression may be converted automatically and the message transmitted without confirmation by the user.
< 3-2. Second embodiment >
Next, a second embodiment will be described with reference to figs. 9 to 11. In the embodiments described above, the case where a message (text, image) is output as a notification to the outside has been described, but the present disclosure is not limited to this; for example, notification by a horn sound can likewise be controlled according to the emotion of the user.
(Structure)
Fig. 9 is a block diagram showing an example of the structure of the vehicle-mounted client terminal 1x according to the second embodiment. The vehicle-mounted client terminal 1x is connected to a vehicle system of an automobile by wire/wireless, and mainly performs output control of a horn sound.
As shown in fig. 9, the client terminal 1x includes a control unit 10x, a communication unit 11x, a horn operation detection unit 19, a camera 13, a sensor 14, a sound input unit 15, a sound output unit 17, and a storage unit 18.
The control unit 10x functions as an arithmetic processing device and a control device, and controls the overall operation in the vehicle-mounted client terminal 1x according to various programs. The control unit 10x is implemented by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. The control unit 10x may include a ROM (Read Only Memory) that stores programs to be used and calculation parameters, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
The control unit 10x according to the present embodiment also functions as the emotion recognition unit 101, the horn conversion unit 105, and the horn output control unit 106.
The emotion recognition section 101 recognizes the emotion of the user (here, the driver) as in the first embodiment.
When a horn operation is performed, the horn sound conversion unit 105 can convert the horn sound according to the emotion of the user. For example, if the emotion of the user is a negative emotion such as "anger", it is presumed that the user is sounding the horn to threaten or attack the other party, so the horn sound is converted to a milder-than-standard sound quality (for example, simply reducing the volume). If the emotion of the user is a positive emotion such as "happiness", it is presumed that the user is expressing gratitude or a greeting to the other party with the horn, so the horn sound is converted to a brighter (more cheerful) sound quality. More specifically, the conversion of the horn sound is performed by controlling parameters of the horn sound. For example, when parameters controlling the horn sound are defined for each emotion, the horn sound conversion unit 105 performs control so that the standard (default) horn parameters are converted into the parameters corresponding to the recognized emotion.
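This parameter control can be pictured as a lookup from the recognized emotion to a horn parameter set. A minimal sketch follows, with invented field names and values rather than a real vehicle API.

```python
# Hedged sketch: emotion-dependent horn sound parameters.
# Field names and values are illustrative assumptions, not real vehicle APIs.
from dataclasses import dataclass

@dataclass
class HornParams:
    volume: float       # 0.0-1.0, relative to the standard horn
    pitch_shift: float  # semitones relative to the standard tone

STANDARD = HornParams(volume=1.0, pitch_shift=0.0)

def params_for(emotion: str) -> HornParams:
    if emotion == "anger":              # soften: quieter, slightly lower
        return HornParams(volume=0.6, pitch_shift=-1.0)
    if emotion == "happiness":          # brighten: slightly higher tone
        return HornParams(volume=1.0, pitch_shift=+2.0)
    return STANDARD                     # calm driver: default parameters

print(params_for("anger"))  # -> HornParams(volume=0.6, pitch_shift=-1.0)
```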
The horn sound output control unit 106 performs horn sound output control. For example, the horn sound output control unit 106 outputs the horn sound when the horn operation detection unit 19 detects a horn operation by the user (for example, pressing of the horn switch). The horn sound output control unit 106 according to the present embodiment performs control such that, when the user performs the horn operation while holding a specific emotion (for example, "anger"), output of the horn sound is temporarily suspended, and the horn sound is output when there is a further output instruction from the user.
The communication unit 11x has an external communication function for transmitting and receiving data to and from an external device such as a server, and an internal communication function for transmitting and receiving data over the in-vehicle network. The in-vehicle network is a communication network connected to the control units in the vehicle (a drive system control unit, a vehicle body system control unit, a battery control unit, an outside-vehicle information detection unit, an in-vehicle information detection unit, and the like). Acquisition of pressing information (horn operation information) from the horn switch (horn operation unit) and control of the horn unit (notification unit) (horn output control) can be performed via the in-vehicle network.
The camera 13, the sensor 14, the sound input unit 15, the sound output unit 17, and the storage unit 18 are the same as in the first embodiment. The camera 13 captures a face image of the user who is the driver, and the sensor 14 comprises various sensors that sense the condition of the user as the driver. The sound output unit 17 according to the present embodiment outputs sound mainly toward the user inside the vehicle; under the control of the horn sound output control unit 106, it may reproduce the horn sound inside the vehicle as a notification to the user.
The horn operation detection unit 19 detects a horn operation performed by the user. For example, the horn operation detection unit 19 acquires, via the in-vehicle network, pressing information from a horn switch (an example of a horn operation unit) provided at the center of the steering wheel, and thereby detects the operation.
The configuration of the vehicle-mounted client terminal 1x according to the present embodiment has been described above. The structure of the client terminal 1x is not limited to the example shown in fig. 9. For example, at least one of the camera 13, the sensor 14, the audio input unit 15, and the audio output unit 17 may be an external device installed in the vehicle, with the client terminal 1x transmitting and receiving data via the in-vehicle network. For example, when the client terminal 1x is a smartphone, a tablet terminal, a wearable device, or a removable in-vehicle terminal, the camera 13 may be a camera installed in the vehicle, the sensor 14 may be provided on the steering wheel, a lever, or the vehicle body (outside or inside the vehicle), and the sound input unit 15 and the sound output unit 17 may be installed in the vehicle.
(action processing)
Next, an operation process according to the present embodiment will be described with reference to fig. 10. Fig. 10 is a flowchart showing an operation process according to the second embodiment.
As shown in fig. 10, first, the client terminal 1x continuously recognizes the emotion of the user who is driving, using the emotion recognition section 101 (step S203).
Next, when the horn operation detection unit 19 detects that the user has performed a horn operation (for example, pressed the horn button) (step S206), the control unit 10x determines whether the emotion of the user at the time of the horn operation is the emotion of "anger" (step S209).
Next, when the emotion of the user is the emotion of "anger" (step S209/Yes), the horn sound output control unit 106 reproduces the horn sound only inside the vehicle through the sound output unit 17 in response to the user operation (step S212). At this stage, no notification is made to the outside of the vehicle; that is, the horn sound output control unit 106 temporarily suspends execution of horn output to the outside of the vehicle in response to the user's horn operation. By sounding the horn inside the vehicle in this way, an effect of cooling down the agitated user can be expected. If the user nevertheless wishes to sound the horn to the outside, the user presses the horn button further or presses it long, for example.
Next, when the user presses the horn button more strongly (or presses it long, or the like) (step S215/Yes), the control unit 10x performs control such that the horn sound conversion unit 105 converts the sound to a milder-than-standard sound quality (parameters) and the horn sound output control unit 106 reproduces the horn sound outside the vehicle (step S218).
On the other hand, when the user stops pressing the horn button (step S215/No), no horn sound is output outside the vehicle.
When the emotion of "anger" is not detected (step S209/No) but the emotion of "happiness" is detected (step S221/Yes), the control unit 10x performs control such that the horn sound conversion unit 105 converts the sound to a brighter-than-standard sound quality (parameters) and the horn sound output control unit 106 reproduces the horn sound outside the vehicle (step S227).
When neither the emotion of "anger" nor that of "happiness" is detected (step S209/No, step S221/No), it can be presumed that the user as the driver is in a normal state and has decided calmly to sound the horn, so the control unit 10x performs control such that the horn sound output control unit 106 reproduces the horn sound outside the vehicle with the standard parameters (without conversion) (step S224).
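The branching of fig. 10 then condenses to the following sketch; `play_inside` and `play_outside` are hypothetical stand-ins for the sound output unit 17 and the external horn unit, and the parameter strings are placeholders.

```python
# Hedged sketch of the fig. 10 horn flow (second embodiment); the output
# hooks and parameter strings are assumptions, not a real vehicle API.
def on_horn_pressed(emotion: str, pressed_again: bool,
                    play_inside=print, play_outside=print) -> None:
    if emotion == "anger":                        # S209/Yes
        play_inside("horn (cabin only)")          # S212: no external output yet
        if pressed_again:                         # S215/Yes: stronger press
            play_outside("horn: mild params")     # S218: milder than standard
    elif emotion == "happiness":                  # S221/Yes
        play_outside("horn: bright params")       # S227: brighter than standard
    else:                                         # S209/No and S221/No
        play_outside("horn: standard params")     # S224: default parameters

on_horn_pressed("anger", pressed_again=True)
```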
Horn sound control corresponding to the emotion of the user has been specifically described above as an example of notification to the outside. The horn sound control according to the present embodiment may also be performed with safety taken into consideration, only when there is no danger. The danger determination may be made as appropriate based on, for example, lane departure, the inter-vehicle distance, the speed of the vehicle, the surrounding situation, and the like.
Further, the horn sound control according to the present embodiment may be performed taking the current position of the user into consideration, for example in a specific area such as an upscale residential area, because it is not desirable to sound the horn in a residential area out of a momentary fit of anger.
Fig. 11 is a flowchart showing an operation process of the speaker control according to the second embodiment.
Steps S203 to S215 shown in fig. 11 are the same as those described with reference to fig. 10, and therefore detailed description thereof will be omitted here.
Next, when the user, despite the reproduction of the horn sound inside the vehicle, wishes to sound the horn to the outside and presses the horn button more strongly (step S215/Yes), the control unit 10x acquires the current position information from the sensor 14 (step S216) and determines whether the vehicle is currently in a specific area such as an upscale residential area (step S217).
Next, when the vehicle is in the specific area (step S217/Yes), the control unit 10x performs control such that the horn sound conversion unit 105 converts the sound to a milder-than-standard sound quality (parameters) and the horn sound output control unit 106 reproduces the horn sound outside the vehicle (step S218).
On the other hand, when the vehicle is not in the specific area (step S217/No), the control unit 10x performs control such that the horn sound output control unit 106 reproduces the horn sound outside the vehicle with the standard parameters (without conversion) (step S224).
< 3-3. Third embodiment >
Next, a third embodiment will be described with reference to figs. 12 to 14. In the present embodiment, the conversion processing of the message expression takes into account not only the emotion of the user who is about to transmit the message but also the emotion of the user at the destination.
Fig. 12 is a diagram illustrating the main configuration of the client terminals 1A and 1B included in the information processing system according to the present embodiment. For example, when user A is about to transmit a message from the client terminal 1A to the client terminal 1B of user B, the client terminal 1A acquires the recognition result of user B's emotion recognized by the emotion recognition section 101B of the client terminal 1B. The client terminal 1A notifies user A of the received emotion of user B and asks whether the message should be sent as it is. This can be expected to calm the agitated user A.
Fig. 13 is a flowchart showing an operation process according to the third embodiment. The processing of steps S303 to S315 shown in fig. 13 is the same as the processing of steps S103 to S118 shown in fig. 7.
When the emotion of the user about to transmit a message is "anger" and a word or pictogram expressing the emotion of "anger" is detected in the character string of the input message (step S315/Yes), the client terminal 1 acquires the emotion of the other party (step S318).
Next, when the emotion of the other party is, for example, "dejection" (step S321/Yes), the display control unit 103 presents the other party's dejection to the user by means of an icon or the like (step S324).
The display control unit 103 also asks the user whether the message may be sent even though the other party is dejected (step S327). Here, fig. 14 shows an example of a presentation screen according to the third embodiment. As shown in fig. 14, the presentation screen 164 displays a sentence to the effect of "The other party is feeling down. Send with the expression unchanged?" together with "Yes/No" buttons. The user can thereby learn the other party's condition and reconsider sending an angry message in a fit of emotion. Here the case of detecting "dejection" has been taken as an example, but the present embodiment is not limited to this; the same applies when "sadness", "surprise", or "fear" is detected.
Next, when the user acknowledges the inquiry (in the example shown in fig. 14, when the "Yes" button is touched in response to the question "Send with the expression unchanged?") (step S330/Yes), the message transmission execution control unit 104 performs control so as to transmit the input character string to the other party as it is (step S333). As in the first embodiment, conversion candidates may also be presented together, and when the user instructs transmission after conversion, the converted message is transmitted to the other party.
On the other hand, when the user does not acknowledge (in the example shown in fig. 14, when the "No" button is touched in response to the question "Send with the expression unchanged?") (step S330/No), the process returns to step S303 and the character string can be input (corrected) again.
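In code terms, the third embodiment adds one more gate before transmission: fetch the partner's emotion and ask the user to re-confirm when the partner is dejected. A minimal sketch follows, in which `fetch_partner_emotion`, the toy word list, and the confirmation flag are assumed placeholders rather than the disclosed implementation.

```python
# Hedged sketch of the fig. 13 flow (third embodiment): check the partner's
# emotion before sending. All helper names are illustrative assumptions.
def fetch_partner_emotion(partner_id: str) -> str:
    return "dejection"      # placeholder for the partner terminal's recognition

def send_considering_partner(message: str, my_emotion: str, partner_id: str,
                             user_confirms: bool, send=print) -> str:
    angry_words = ("idiot", "shut up")              # toy dictionary (S315)
    if my_emotion == "anger" and any(w in message for w in angry_words):
        if fetch_partner_emotion(partner_id) == "dejection":   # S318-S321
            # S324-S327: "The other party is feeling down. Send as is?"
            if not user_confirms:                   # S330/No
                return "back to editing"            # return to input (S303)
    send(message)                                   # S333: send on confirmation
    return "sent"

print(send_considering_partner("shut up", "anger", "user_b",
                               user_confirms=False))  # -> "back to editing"
```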
< 3-4. Fourth embodiment >
Next, a fourth embodiment will be described with reference to fig. 15. The present embodiment further has a function of translating the message transmitted to the other party.
Fig. 15 is a flowchart showing the operation processing according to the fourth embodiment. The processing of steps S103 to S127 shown in fig. 15 is the same as the processing with the same reference numerals shown in fig. 7.
When the emotion of the user about to transmit a message is "anger" and a word or pictogram expressing the emotion of "anger" is detected in the character string of the input message, the client terminal 1 presents a conversion candidate; when the conversion candidate is selected by the user (step S127/Yes), the message transmission execution control unit 104 translates the converted character string and transmits it to the other party (step S131).
On the other hand, when the user does not select the conversion candidate (step S127/no), the message transmission execution control unit 104 translates the input character string as it is and transmits it to the other party (step S134).
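The fourth embodiment only changes the tail of the fig. 7 flow: whichever string is chosen (converted or original) passes through a translation step before transmission. A minimal sketch with a dummy `translate`; both helper names are assumptions, not a real translation API.

```python
# Hedged sketch of the fig. 15 variation (fourth embodiment): translate the
# chosen string before sending.
from typing import Optional

def translate(text: str, target_lang: str = "en") -> str:
    return f"[{target_lang}] {text}"    # stand-in for a real translation service

def send_translated(message: str, accepted_candidate: Optional[str],
                    send=print) -> None:
    chosen = accepted_candidate or message   # S127: converted string if accepted
    send(translate(chosen))                  # S131/S134: translate, then send

send_translated("shut up", accepted_candidate="please stop")  # -> "[en] please stop"
```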
< 4. Summary >
As described above, in the information processing system according to the embodiment of the present disclosure, the execution of notification to the outside can be controlled according to the emotion of the user.
Specifically, the emotion of the user at the time of a notification to the partner user (a message (text, image), a horn sound, or the like) can be recognized; if the emotion of the user is negative, the notification (one expressing the negative emotion) is controlled to be moderated, and if the emotion of the user is positive, the notification (one expressing the positive emotion) is controlled to be emphasized.
This can prevent deterioration of relationships, and the user's own later regret, caused by notifying the other party directly while holding a negative emotion. Conversely, when the user holds a positive emotion, conveying the positive emotion to the other party with added emphasis can further improve the relationship.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present technology is not limited to these examples. It is clear that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various alterations and modifications within the scope of the technical ideas described in the claims, and it should be understood that these also naturally belong to the technical scope of the present disclosure.
For example, a computer program for causing hardware such as the CPU, ROM, and RAM built into the client terminal 1 or the server 2 to exhibit the functions of the client terminal 1 or the server 2 can also be created. A computer-readable storage medium storing the computer program is also provided.
Further, although the description has dealt with messages using text and pictograms, the present embodiment is not limited to this and may also be applied to the display of symbols expressing the emotion of the user. For example, in a system in which a display made of electronic paper or the like is provided on the outside of a vehicle and a symbol expressing the emotion of the driver can be displayed, displaying a positive emotion can be expected to improve relations with the surroundings, but displaying a negative emotion as it is may cause friction. Therefore, when the control processing according to the present embodiment is applied, for example when a symbol of "anger" is about to be displayed while the emotion of "anger" is held, the display can be temporarily suspended and the user asked to confirm whether it may really be shown. Alternatively, when a symbol of "anger" is about to be displayed while the emotion of "anger" is held, the symbol may be converted into a somewhat milder symbol before being output.
The effects described in this specification are merely illustrative or exemplary, and are not limiting. In other words, the technology according to the present disclosure may exert other effects that are clear to those skilled in the art from the description of this specification, in addition to or in place of the above effects.
The present technology can also adopt the following configuration.
(1)
An information processing system is provided with:
a notification unit for notifying the outside; and
a control unit that controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, execution of the notification is temporarily suspended after a user operation instructing its execution, and the notification to the outside is executed in response to a further user operation instructing its execution.
(2) The information processing system according to the above (1), wherein,
the control unit controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, execution of the notification is temporarily suspended after a user operation instructing its execution, and the notification is performed after being changed to a milder expression in accordance with a further user operation.
(3) The information processing system according to the above (2), wherein,
the control unit controls the notification unit so as to translate the content of the notification changed to the milder expression and then perform the notification.
(4) The information processing system according to any one of the above (1) to (3), wherein,
the control unit controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, the content of the notification is analyzed, and if a negative expression is present in the content of the notification, execution of the notification is temporarily suspended after a user operation instructing its execution, and the notification to the outside is executed in response to a further user operation instructing its execution.
(5) The information processing system according to the above (4), wherein,
the control unit controls the notification unit as follows:
when there is a negative expression in the content of the notification, execution of the notification is temporarily suspended after a user operation instructing its execution, and the notification is performed after being changed to a milder expression in accordance with a further user operation.
(6) The information processing system according to the above (5), wherein,
the control unit performs control so as to temporarily suspend execution of the notification, and to wait for the further user operation after presenting the notification changed to the milder expression to the user.
(7) The information processing system according to any one of the above (1) to (6), wherein,
the notification is a text or image notification.
(8) The information processing system according to any one of the above (1) to (7), wherein,
the information processing system further includes a presentation unit for presenting information to a user,
the control unit controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, the notification is presented to the user after a user operation instructing its execution, and the notification to the outside is executed in response to a further user operation instructing its execution.
(9) The information processing system according to the above (8), wherein,
the control unit, when the user is recognized as holding a positive emotion before notification by the notification unit, changes the expression of the notification so as to convey the positive emotion more strongly.
(10) The information processing system according to the above (7) or (8), wherein,
the notification is a horn sound.
(11) The information processing system according to any one of the above (1) to (7), wherein,
the information processing system further includes a presentation unit for presenting information to the user,
the control unit also acquires the emotion of the partner user who receives the notification,
the control unit controls the notification unit such that, when the user is recognized as holding a negative emotion before notification by the notification unit, the emotion of the partner user is notified to the user by the presentation unit, and the notification is performed after its content is changed to a milder expression in accordance with the user's response to the emotion of the partner user.
(12) A storage medium having a program recorded thereon, wherein the program causes a computer to function as:
a notification unit for notifying the outside; and
a control unit that controls the notification unit as follows: when it is recognized that the user has a negative emotion before the notification is performed by the notification unit, execution of the notification is temporarily suspended after the user operation instructing execution of the notification, and the notification to the outside is executed in response to a further user operation instructing execution of the notification.
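The configurations enumerated above describe a common gating flow: when the sender is recognized as having a negative emotion, an outgoing notification is held back once after the send operation and is executed only on a further operation. The following is a minimal sketch of that flow in Python; it is an illustration only, and every name in it (NotificationController, on_send_operation, and so on) is hypothetical rather than taken from the patent, which does not specify an implementation.

from enum import Enum


class Emotion(Enum):
    POSITIVE = "positive"
    NEUTRAL = "neutral"
    NEGATIVE = "negative"


class NotificationController:
    """Suspends an outgoing notification once when the sender is recognized
    as having a negative emotion; executes it only on a further operation."""

    def __init__(self, recognize_emotion, notify):
        self.recognize_emotion = recognize_emotion  # emotion recognition callable
        self.notify = notify                        # performs the external notification
        self.suspended = None                       # content held back, if any

    def on_send_operation(self, content):
        # First user operation instructing execution of the notification.
        if self.suspended is None and self.recognize_emotion() is Emotion.NEGATIVE:
            self.suspended = content  # temporarily suspend instead of sending
            return "suspended: operate again to send"
        return self._execute(content)

    def on_further_send_operation(self):
        # Further user operation: the user still intends to send.
        if self.suspended is not None:
            content, self.suspended = self.suspended, None
            return self._execute(content)
        return None

    def _execute(self, content):
        self.notify(content)  # the notification to the outside
        return "sent"


# Hypothetical usage: an angry user's message is held back exactly once.
controller = NotificationController(lambda: Emotion.NEGATIVE, print)
controller.on_send_operation("Why is this STILL broken?!")  # suspended, nothing printed
controller.on_further_send_operation()                      # prints the message

In this sketch the emotion recognizer and the external notifier are passed in as plain callables; in the patent's terms they correspond roughly to the emotion recognition and notification units, but how recognition is performed (face, voice, biometric sensors) is left open.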
Description of the reference numerals
1, 1x … client terminal
10, 10x … control unit
101 … emotion recognition unit
102 … expression conversion unit
103 … display control unit
104 … message transmission execution control unit
105 … horn sound conversion unit
106 … horn sound output control unit
11, 11x … communication unit
12 … operation input unit
13 … camera
14 … sensor
15 … sound input unit
16 … display unit
17 … sound output unit
18 … storage unit
19 … horn operation detection unit
2 … server
20 … control unit
201 … emotion recognition unit
202 … expression conversion unit
203 … output control unit
204 … message transmission execution control unit
21 … communication unit
22 … storage unit
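The expression conversion units (102 and 202) listed above are the components that turn notification content containing a negative expression into a milder one. The patent leaves the analysis and conversion methods open; the sketch below assumes the simplest possible approach, a replacement dictionary, and its word lists and function names are illustrative only.

NEGATIVE_TO_MILD = {
    "hate": "don't much like",
    "terrible": "not great",
    "stupid": "puzzling",
}


def contains_negative_expression(text):
    """Rough content analysis: does the text contain a listed negative term?"""
    lowered = text.lower()
    return any(term in lowered for term in NEGATIVE_TO_MILD)


def to_milder_expression(text):
    """Replace each listed negative term with its milder alternative."""
    for term, mild in NEGATIVE_TO_MILD.items():
        text = text.replace(term, mild)
    return text


message = "this design is terrible and I hate it"
if contains_negative_expression(message):
    message = to_milder_expression(message)
print(message)  # "this design is not great and I don't much like it"

A production system would presumably use sentiment analysis rather than fixed word lists, but the control flow the claims describe is the same: analyze the content, then convert it before the suspended notification is re-executed.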

Claims (10)

1. An information processing system is provided with:
a notification unit for notifying the outside; and
a control unit that controls the notification unit as follows: if the user is recognized as holding a negative emotion before the notification is performed by the notification unit, the notification is temporarily interrupted after the user operation for instructing the notification is performed, and the notification to the outside is performed according to the user operation for instructing the further notification is performed,
wherein the control unit controls the notification unit as follows: if the user is recognized as holding a negative emotion before the notification is performed by the notification unit, the content of the notification is analyzed, and if the content of the notification includes a negative expression, the notification is temporarily interrupted after the user operation for instructing the execution of the notification is performed, and the notification to the outside is performed according to the user operation for instructing the further execution of the notification.
2. The information processing system of claim 1, wherein,
the control unit controls the notification unit as follows: when a negative expression is included in the content of the notification, execution of the notification is temporarily suspended after the user operation instructing execution of the notification, and the notification is performed after the content is changed to a milder expression in accordance with a further user operation.
3. The information processing system of claim 2, wherein,
the control unit controls the notification unit to paraphrase the content of the notification into the milder expression and then perform the notification.
4. The information processing system of claim 2, wherein,
the control unit temporarily suspends execution of the notification, and waits for the further user operation after presenting to the user the content of the notification changed to the milder expression.
5. The information processing system of claim 1, wherein,
the notification is a text or image notification.
6. The information processing system of claim 1, wherein,
the information processing system further includes a presentation unit for presenting information to a user,
the control unit controls the notification unit as follows: when it is recognized that the user has a negative emotion before the notification is performed by the notification unit, the content of the notification is presented to the user by the presentation unit after the user operation instructing execution of the notification, and the notification to the outside is executed in response to a further user operation instructing execution of the notification.
7. The information processing system of claim 6, wherein,
the control unit is configured to: when it is determined that the user has a positive emotion before the notification is performed by the notification unit, change the expression of the notification so as to convey the positive emotion more strongly.
8. The information processing system of claim 5, wherein,
the notification is a horn sound.
9. The information processing system of claim 1, wherein,
the information processing system further includes a presentation unit for presenting information to the user,
the control unit also acquires the emotion of the partner user who receives the notification, and
the control unit controls the notification unit as follows: when it is estimated that the user has a negative emotion before the notification is performed by the notification unit, the emotion of the partner user is presented to the user by the presentation unit, and the notification is performed after the content of the notification is changed to a milder expression in accordance with the user's response to the presented emotion of the partner user.
10. A storage medium having a program recorded thereon, wherein the program causes a computer to function as:
a notification unit for notifying the outside; and
a control unit that controls the notification unit as follows: when it is recognized that the user has a negative emotion before the notification is performed by the notification unit, execution of the notification is temporarily suspended after the user operation instructing execution of the notification, and the notification to the outside is executed in response to a further user operation instructing execution of the notification,
wherein the control unit controls the notification unit as follows: when it is recognized that the user has a negative emotion before the notification is performed by the notification unit, the content of the notification is analyzed, and when a negative expression is included in the content of the notification, execution of the notification is temporarily suspended after the user operation instructing execution of the notification, and the notification to the outside is executed in response to a further user operation instructing execution of the notification.
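Claim 8 and the horn-related reference numerals (horn sound conversion unit 105, horn sound output control unit 106, horn operation detection unit 19) apply the same gating to a vehicle horn: a horn press by a driver recognized as having a negative emotion is held back once. A minimal sketch follows; the time window and the substitution of a softened tone on the repeated press are assumptions suggested by the reference numerals, not details stated in the claims, and all names are hypothetical.

import time

SUSPEND_WINDOW_S = 2.0  # assumed window in which a repeated press means "really honk"


class HornController:
    """Gates horn output when the driver is recognized as having a negative emotion."""

    def __init__(self, driver_is_negative, output_sound):
        self.driver_is_negative = driver_is_negative  # emotion recognition callable
        self.output_sound = output_sound              # drives the horn output
        self.suspended_at = None                      # time of the suspended press

    def on_horn_operation(self):
        # Called by the horn operation detection unit on each press.
        now = time.monotonic()
        if self.suspended_at is not None and now - self.suspended_at <= SUSPEND_WINDOW_S:
            # Further operation inside the window: honk, but with a softened
            # tone (assumed behavior of the horn sound conversion unit).
            self.suspended_at = None
            self.output_sound("soft_horn.wav")
        elif self.driver_is_negative():
            self.suspended_at = now  # temporarily suspend the first press
        else:
            self.output_sound("normal_horn.wav")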
CN201880022385.8A 2017-04-06 2018-01-05 Information processing system and storage medium Active CN110462597B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-075831 2017-04-06
JP2017075831 2017-04-06
PCT/JP2018/000062 WO2018185988A1 (en) 2017-04-06 2018-01-05 Information processing system and storage medium

Publications (2)

Publication Number Publication Date
CN110462597A CN110462597A (en) 2019-11-15
CN110462597B true CN110462597B (en) 2023-10-20

Family

ID=63713122

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880022385.8A Active CN110462597B (en) 2017-04-06 2018-01-05 Information processing system and storage medium

Country Status (2)

Country Link
CN (1) CN110462597B (en)
WO (1) WO2018185988A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2020194828A1 (en) * 2019-03-22 2020-10-01
US11283751B1 (en) 2020-11-03 2022-03-22 International Business Machines Corporation Using speech and facial bio-metrics to deliver text messages at the appropriate time
JP7360756B1 (en) 2022-04-15 2023-10-13 株式会社三鷹ホールディングス Display device using magnetic fluid

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012146219A (en) * 2011-01-13 2012-08-02 Nikon Corp Electronic apparatus and control program therefor
JP2013029928A (en) * 2011-07-27 2013-02-07 Nec Casio Mobile Communications Ltd Information terminal device, data processing method for the same and program
JP2015046065A (en) * 2013-08-28 2015-03-12 ヤフー株式会社 Information processing device, control method, and control program
CN106502316A (en) * 2015-09-04 2017-03-15 松下电器(美国)知识产权公司 Control method, communication terminal, communication system and wearable terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11250066A (en) * 1998-03-04 1999-09-17 Casio Comput Co Ltd Electronic mail device and medium for recording electronic mail processing program

Also Published As

Publication number Publication date
WO2018185988A1 (en) 2018-10-11
CN110462597A (en) 2019-11-15

Similar Documents

Publication Publication Date Title
CN110609620B (en) Human-computer interaction method and device based on virtual image and electronic equipment
WO2017130486A1 (en) Information processing device, information processing method, and program
CN110462597B (en) Information processing system and storage medium
CN109040641B (en) Video data synthesis method and device
CN111372119A (en) Multimedia data recording method and device and electronic equipment
KR102415552B1 (en) Display device
CN108683850B (en) Shooting prompting method and mobile terminal
WO2021008538A1 (en) Voice interaction method and related device
CN108197299B (en) Photographing and question searching method and system based on handheld photographing equipment
CN109819167B (en) Image processing method and device and mobile terminal
CN111522524B (en) Presentation control method and device based on conference robot, storage medium and terminal
KR102443636B1 (en) Electronic device and method for providing information related to phone number
CN108133708B (en) Voice assistant control method and device and mobile terminal
CN112489647A (en) Voice assistant control method, mobile terminal and storage medium
JP6973380B2 (en) Information processing device and information processing method
WO2016206642A1 (en) Method and apparatus for generating control data of robot
CN109819331B (en) Video call method, device and mobile terminal
JP2017211430A (en) Information processing device and information processing method
CN110728206A (en) Fatigue driving detection method and device, computer readable storage medium and terminal
CN112700783A (en) Communication sound changing method, terminal equipment and storage medium
CN114627872A (en) Virtual human voice tone control method, equipment and computer readable storage medium
CN114065168A (en) Information processing method, intelligent terminal and storage medium
US11513768B2 (en) Information processing device and information processing method
CN109558853B (en) Audio synthesis method and terminal equipment
CN109542293B (en) Menu interface setting method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant