WO2019187593A1 - Information processing device and method, and program - Google Patents

Information processing device and method, and program

Info

Publication number
WO2019187593A1
WO2019187593A1 (PCT/JP2019/002858)
Authority
WO
WIPO (PCT)
Prior art keywords
user
avatar
behavior state
input
information processing
Prior art date
Application number
PCT/JP2019/002858
Other languages
English (en)
Japanese (ja)
Inventor
Kenji Sugihara
Mari Saito
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Priority to US17/040,194 (published as US20210014457A1)
Publication of WO2019187593A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H04N7/157 Conference systems defining a virtual conference space and using avatars or agents
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/54 Interprogram communication
    • G06F9/542 Event management; Broadcasting; Multicasting; Notifications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers

Definitions

  • This disclosure relates to an information processing apparatus, an information processing method, and a program.
  • According to the present disclosure, there is provided an information processing apparatus including: a first behavior state acquisition unit that acquires first behavior state information related to the behavior state of a first user present at a first location; a second behavior state acquisition unit that acquires second behavior state information related to the behavior state of a second user present at a second location; an avatar control unit that gradually changes, according to the behavior state of the first user, a first avatar that represents the first user and is provided so as to be visible to the second user at the second location; an input determination unit that determines an input related to the first avatar based on the second behavior state information; and a transmission unit that transmits a signal to a terminal of the first user present at the first location based on the input related to the first avatar.
  • An information processing method is likewise provided, which includes transmitting a signal to a terminal of the first user present at the first location.
  • Further, there is provided a program for causing a computer to function as an information processing apparatus including: a first behavior state acquisition unit that acquires first behavior state information related to the behavior state of a first user present at a first location; a second behavior state acquisition unit that acquires second behavior state information related to the behavior state of a second user present at a second location; an avatar control unit that gradually changes, according to the behavior state of the first user, a first avatar that represents the first user and is provided so as to be visible to the second user at the second location; an input determination unit that determines an input related to the first avatar based on the second behavior state information; and a transmission unit that transmits a signal to a terminal of the first user present at the first location based on the input related to the first avatar.
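  • For illustration only, the following minimal Python sketch shows one way the five units recited above could be composed; all class, method, and state names here are hypothetical assumptions, not taken from the publication.

        from dataclasses import dataclass

        @dataclass
        class BehaviorState:
            user_id: str
            state: str  # e.g. "notification_start", "noticed", "response_rejection"

        class InformationProcessingApparatus:
            """Hypothetical composition of the recited units (names assumed)."""

            def acquire_first_behavior_state(self, sensed: str) -> BehaviorState:
                # First behavior state acquisition unit: behavior of user A (first location).
                return BehaviorState("A", sensed)

            def acquire_second_behavior_state(self, sensed: str) -> BehaviorState:
                # Second behavior state acquisition unit: behavior of user B (second location).
                return BehaviorState("B", sensed)

            def control_avatar(self, first_state: BehaviorState) -> str:
                # Avatar control unit: gradually change the first avatar according
                # to user A's behavior state (here reduced to a label).
                return f"avatar_A_shows_{first_state.state}"

            def determine_input(self, second_state: BehaviorState) -> str:
                # Input determination unit: input related to the first avatar,
                # derived from user B's behavior state information.
                return "first_input" if second_state.state == "respond" else "second_input"

            def transmit(self, determined_input: str) -> str:
                # Transmission unit: signal sent to user A's terminal.
                return f"signal_for_{determined_input}"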
  • FIG. 9 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
  • a plurality of constituent elements having substantially the same or similar functional configuration may be distinguished by adding different numerals after the same reference numerals. However, when it is not necessary to particularly distinguish each of a plurality of constituent elements having substantially the same or similar functional configuration, only the same reference numerals are given.
  • similar components in different embodiments may be distinguished by appending different letters after the same reference numerals. However, when it is not necessary to distinguish each similar component, only the same reference numeral is given.
  • It is useful if the user on the notification receiving side or the user on the notification sending side can grasp the situation of the other party. For example, by grasping the behavior state of the sending user, the receiving user can estimate the urgency of the notification. Conversely, by grasping the behavior state of the receiving user, the sending user can infer whether the receiving user is likely to respond to the notification.
  • In the present specification, a technique that allows the user on the notification receiving side or the user on the notification sending side to easily grasp the situation of the other party will mainly be described.
  • Specifically, a technique is described that allows the user on the notification receiving side or the user on the notification sending side to grasp the situation of the other party through an avatar that changes according to the other party's behavior state.
  • FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
  • the information processing system 1 includes a transmission-side terminal 10A and a reception-side terminal 10B.
  • the transmitting terminal 10A can be used by the user A.
  • the receiving terminal 10B can be used by the user B.
  • the transmission side terminal 10A and the reception side terminal 10B are connected to the network 90 and configured to be able to communicate with each other via the network 90.
  • the sending terminal 10A exists at location X.
  • User A also exists at location X.
  • an avatar representing the user B (hereinafter, also simply referred to as “user B avatar”) exists in the location X, and the avatar of the user B can be visually recognized by the user A existing in the location X.
  • the receiving side terminal 10B exists in the location Y.
  • User B also exists in location Y.
  • an avatar representing the user A (hereinafter also simply referred to as “user A avatar”) exists in the location Y, and the avatar of the user A is visible to the user B existing in the location Y.
  • Each of the location X and the location Y may be an area having a certain extent, and each of the location X and the location Y may be located anywhere.
  • In the following, the location X may be referred to as a first location, the user A as a first user, and the user A's avatar as a first avatar; similarly, the location Y may be referred to as a second location, the user B as a second user, and the user B's avatar as a second avatar.
  • the user A tries to make a call with the user B.
  • the user A performs a notification start operation on the transmitting terminal 10A.
  • the notification corresponds to a call before a call is actually started.
  • “notification” is also referred to as “call”.
  • various call means such as voice call and video call can be used for the call.
  • the notification start operation may be regarded as a communication start permission request transmitted from the user A to the user B.
  • the change of the user A's avatar, which will be described later, may be considered to be started in response to a permission request regarding the start of communication transmitted from the terminal of the first user.
  • the notification start operation may correspond to an example of the action state of the user A.
  • when the transmission side terminal 10A detects the notification start operation, the transmission side terminal 10A starts notification to the reception side terminal 10B.
  • the information related to the behavior state of the user A may be referred to as first behavior state information.
  • information regarding the behavior state of the user B may be referred to as second behavior state information.
  • the first behavior state information may include various information regarding the behavior state of the user A, such as output signals of various sensors that sense the behavior state of the user A, and determination results of the behavior state of the user A based on the output signals.
  • the second behavior state information may include various information regarding the behavior state of the user B.
  • When the user B wants to start a call with the user A, the user B performs a notification response operation on the receiving terminal 10B. When detecting the notification response operation, the receiving terminal 10B establishes a connection with the transmitting terminal 10A. As a result, a call between the user A and the user B via the transmitting terminal 10A and the receiving terminal 10B is established, and communication between the two users via these terminals is started.
  • the notification response operation can correspond to an example of the behavior state of the user B.
  • communication is performed between the user A and the user B by voice.
  • communication may be performed between the user A and the user B by other content (for example, video or the like) instead of the voice or in addition to the voice.
  • each of the transmitting terminal 10A and the receiving terminal 10B is a PC (Personal Computer).
  • each of the transmitting terminal 10A and the receiving terminal 10B is not limited to a PC.
  • at least one of the transmitting terminal 10A and the receiving terminal 10B may be a television device, a mobile phone, a tablet terminal, or a smartphone. It may be a wearable terminal (for example, a head-mounted display) or a camera.
  • Each of the transmission-side terminal 10A and the reception-side terminal 10B can function as an information processing apparatus.
  • In the following, the elapsed time from the start of notification is referred to as the “call time”, and the level associated with the call time is referred to as the “notification level”.
  • FIG. 2 is a diagram showing an example of the correspondence between the call time and the notification level.
  • a user A who uses the transmitting terminal 10A is shown.
  • the user A notifies the user B.
  • the transmission side terminal 10A starts notification to the reception side terminal 10B used by the user B.
  • Suppose that the user B does not respond to the notification even after a first time (for example, 10 seconds) from the start of notification.
  • If the user A thinks that the user B cannot respond, the user A cancels the notification to the user B (S21).
  • If the user A has business with the user B, the user A continues the notification to the user B (S31).
  • Next, suppose that the user B does not respond to the notification even after a second time (for example, 30 seconds) from the start of notification.
  • If the user A has business with the user B but the user B seems busy, the user A may decide to give up the business for the user B (S12) and then stop the notification to the user B (S22).
  • If the user A cannot give up the business for the user B, the user A continues the notification to the user B (S32).
  • Further, suppose that the user B does not respond to the notification even after a third time (for example, one minute) from the start of notification.
  • If the user A wants to tell the user B the business if possible but concludes that a response to the notification is impossible (S13), the user A stops the notification to the user B (S23).
  • If the user A still wants to tell the user B the business, the user A continues the notification to the user B (S33).
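  • As a concrete illustration of the correspondence above, the following Python sketch maps the call time to a notification level using the example thresholds named in the text (10 seconds, 30 seconds, one minute); the number of levels is an assumption for illustration.

        def notification_level_from_call_time(elapsed_s: float) -> int:
            """Map elapsed time since notification start to a notification level.

            Thresholds follow the example above: first time = 10 s,
            second time = 30 s, third time = 60 s.
            """
            if elapsed_s < 10:
                return 1
            if elapsed_s < 30:
                return 2
            if elapsed_s < 60:
                return 3
            return 4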
  • The notification to the user B is preferably one that the user B is more likely to notice as the notification level becomes higher. Accordingly, the notification to the user B may gradually change into a notification that is easier to notice.
  • In the present embodiment, the change in the notification to the user B is realized as a change in the avatar of the user A visually recognized by the user B.
  • FIG. 3 is a diagram illustrating an example of a correspondence relationship between an element that changes an avatar and a notification level.
  • In FIG. 3, an example of the correspondence relationship between the call time and the notification level is shown as an example of an element that changes the user A's avatar.
  • the correspondence between the call time and the notification level is as described with reference to FIG.
  • FIG. 3 shows an example of a correspondence relationship between the stress level of the user A and the notification level as an example of an element that changes the avatar of the user A.
  • the stress level of the user A may be detected in any way.
  • the stress level of the user A may be estimated and acquired based on an image recognition result for an image of the user A captured by an imaging device, or based on biological information sensed from the user A by a biological sensor.
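  • The stress level can be treated as a second element that raises the notification level alongside the call time. A hedged sketch follows, reusing notification_level_from_call_time from the sketch above; the normalization of stress to [0, 1] and the max() combination are assumptions, not from the publication.

        def notification_level(elapsed_s: float, stress: float) -> int:
            """Combine the call-time element with the stress-level element."""
            clamped = min(max(stress, 0.0), 1.0)
            level_from_stress = 1 + int(clamped * 3)  # illustrative scaling
            return max(notification_level_from_call_time(elapsed_s), level_from_stress)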
  • FIG. 4 is a diagram illustrating a functional configuration example of the transmission-side terminal 10A.
  • the transmission-side terminal 10A includes a control unit 110A, a sensor unit 120A, a communication unit 140A, a storage unit 150A, and a presentation unit 160A.
  • these functional blocks provided in the transmission-side terminal 10A will be described.
  • the control unit 110A may be configured by a processing device such as one or a plurality of CPUs (Central Processing Units). When these blocks are configured by a processing device such as a CPU, the processing device may be configured by an electronic circuit.
  • the control unit 110A can be realized by executing a program by such a processing device.
  • the control unit 110A includes a transmission-side user behavior state acquisition unit 111A, an input determination unit 112A, a transmission unit 113A, a reception-side user behavior state acquisition unit 114A, and an avatar control unit 115A.
  • the transmission-side user behavior state acquisition unit 111A can correspond to an example of a second behavior state acquisition unit.
  • the receiving-side user behavior state acquisition unit 114A can correspond to an example of a first behavior state acquisition unit. Detailed functions of these blocks will be described later.
  • the sensor unit 120A has various sensors, and detects various sensing data with the various sensors. More specifically, the sensor unit 120A detects the voice uttered by the user A and the state of the user A. The state of the user A can include the behavior state of the user A. The sensor unit 120A may be provided at an arbitrary location on the transmission side terminal 10A.
  • the sensor unit 120A includes a microphone and an imaging device. Then, it is assumed that the voice emitted from the user A is detected by the microphone and the voice emitted from the user A is used for communication. However, instead of the voice uttered by the user A or in addition to the voice uttered by the user A, an image captured by the imaging device of the user A may be used for communication.
  • the state of the user A is detected by the imaging device.
  • the state of the user A may be detected by a sensor other than the imaging device.
  • the transmitting terminal 10A is a wearable terminal
  • the state of the user A may be detected by a wearable terminal sensor (for example, an acceleration sensor, a gyro sensor, a vibration sensor, a GPS (Global Positioning System) sensor, etc.).
  • a wearable terminal sensor for example, an acceleration sensor, a gyro sensor, a vibration sensor, a GPS (Global Positioning System) sensor, etc.
  • the communication unit 140A includes a communication circuit, and has a function of performing communication with the receiving terminal 10B via the network 90.
  • the communication unit 140A has a function of acquiring data from the receiving terminal 10B and providing data to the receiving terminal 10B.
  • the communication unit 140A transmits a notification to the reception-side terminal 10B via the network 90 when the notification start operation by the user A is detected by the sensor unit 120A.
  • the communication unit 140A receives a notification response from the reception-side terminal 10B via the network 90, the communication unit 140A establishes a connection with the reception-side terminal 10B via the network 90.
  • when the voice uttered by the user A is detected by the sensor unit 120A, the communication unit 140A transmits the voice to the reception side terminal 10B via the network 90.
  • the storage unit 150A includes a memory, and is a recording medium that stores a program executed by the control unit 110A and stores data necessary for executing the program.
  • the storage unit 150A temporarily stores data for calculation by the control unit 110A.
  • the storage unit 150A includes a magnetic storage device, a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the presenting unit 160A presents various information to the user A.
  • the presentation unit 160A includes a display and a speaker.
  • the type of display is not limited.
  • the display may be a liquid crystal display, an organic EL (Electro-Luminescence) display, or a projector that can project onto a wall or the like.
  • the display may be a light such as an LED (light-emitting diode).
  • the user B's avatar is a virtual object and the display displays the user B's avatar.
  • the avatar of the user B may be a real object.
  • the real object may be, for example, a moving body having a driving mechanism. More specifically, various forms such as a moving body including a roller, a wheel, or a tire, a biped walking robot, a quadruped walking robot, or the like can be adopted.
  • such an autonomously moving body can be configured as an independent information processing apparatus.
  • the presenting unit 160A may not have a display.
  • the speaker outputs the sound when the connection with the receiving terminal 10B is established by the communication unit 140A and the sound uttered by the user B is received from the receiving terminal 10B via the network 90. The sound output from the speaker is perceived audibly by the user A.
  • control unit 110A, the sensor unit 120A, the communication unit 140A, the storage unit 150A, and the presentation unit 160A exist inside the transmission-side terminal 10A.
  • the control unit 110A, the sensor unit 120A, the communication unit 140A, the storage unit 150A, and the presentation unit 160A may exist outside the transmission-side terminal 10A.
  • FIG. 5 is a diagram illustrating a functional configuration example of the reception-side terminal 10B.
  • the reception-side terminal 10B includes a control unit 110B, a sensor unit 120B, a communication unit 140B, a storage unit 150B, and a presentation unit 160B.
  • these functional blocks provided in the receiving terminal 10B will be described.
  • the control unit 110B may be configured by a processing device such as one or a plurality of CPUs (Central Processing Units). When these blocks are configured by a processing device such as a CPU, the processing device may be configured by an electronic circuit. The control unit 110B can be realized by executing a program by the processing device.
  • the control unit 110B includes a reception-side user behavior state acquisition unit 111B, an input determination unit 112B, a transmission unit 113B, a transmission-side user behavior state acquisition unit 114B, and an avatar control unit 115B.
  • the receiving-side user behavior state acquisition unit 111B can correspond to an example of a second behavior state acquisition unit.
  • the transmission-side user behavior state acquisition unit 114B can correspond to an example of a first behavior state acquisition unit. Detailed functions of these blocks will be described later.
  • the sensor unit 120B has various sensors, and detects various sensing data by the various sensors. More specifically, the sensor unit 120B detects the voice emitted by the user B and the state of the user B.
  • the state of user B may include the behavior state of user B.
  • the sensor unit 120B may be provided at an arbitrary location on the receiving side terminal 10B.
  • the sensor unit 120B includes a microphone and an imaging device. Then, it is assumed that the voice emitted by the user B is detected by the microphone and the voice emitted by the user B is used for communication. However, instead of the sound uttered by the user B or in addition to the sound uttered by the user B, an image of the user B captured by the imaging device may be used for communication.
  • the state of the user B is detected by the imaging device.
  • the state of the user B may be detected by a sensor other than the imaging device.
  • the receiving terminal 10B is a wearable terminal
  • the state of the user B may be detected by a wearable terminal sensor (for example, an acceleration sensor, a gyro sensor, a vibration sensor, a GPS (Global Positioning System) sensor, etc.).
  • a wearable terminal sensor for example, an acceleration sensor, a gyro sensor, a vibration sensor, a GPS (Global Positioning System) sensor, etc.
  • the communication unit 140B includes a communication circuit, and has a function of performing communication with the transmitting terminal 10A via the network 90.
  • the communication unit 140B has a function of acquiring data from the transmitting terminal 10A and providing data to the transmitting terminal 10A.
  • the communication unit 140B transmits a notification response to the transmitting terminal 10A via the network 90, and communicates with the transmitting terminal 10A via the network 90. Establish a connection.
  • when the voice uttered by the user B is detected by the sensor unit 120B, the communication unit 140B transmits the voice to the transmission side terminal 10A via the network 90.
  • the communication unit 140B receives a notification from the transmission-side terminal 10A via the network 90.
  • the storage unit 150B includes a memory, and is a recording medium that stores a program executed by the control unit 110B and stores data necessary for executing the program.
  • the storage unit 150B temporarily stores data for calculation by the control unit 110B.
  • the storage unit 150B includes a magnetic storage device, a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the presentation unit 160B presents various information to the user B.
  • the presentation unit 160B includes a display and a speaker.
  • the type of display is not limited.
  • the display may be a liquid crystal display, an organic EL (Electro-Luminescence) display, or a projector that can project onto a wall or the like.
  • the display may be a light such as an LED (light-emitting diode).
  • the user A's avatar is a virtual object, and the display displays the user A's avatar.
  • the user A's avatar may be a real object (for example, a robot).
  • the presentation unit 160B may not have a display.
  • the speaker outputs the sound when the communication unit 140B establishes the connection with the transmitting terminal 10A and receives the sound uttered by the user A from the transmitting terminal 10A via the network 90. The sound output from the speaker is perceived audibly by the user B.
  • control unit 110B, the sensor unit 120B, the communication unit 140B, the storage unit 150B, and the presentation unit 160B are present inside the reception-side terminal 10B.
  • the control unit 110B, the sensor unit 120B, the communication unit 140B, the storage unit 150B, and the presentation unit 160B may exist outside the reception-side terminal 10B.
  • the sensor unit 120A obtains information regarding the action state of the user A by sensing.
  • the action state of the user A may include notification start, notification continuation, notification end, re-notification, and the like.
  • the sensor unit 120A may be, for example, a sensor that acquires image information or depth information of the user A.
  • the input determination unit 112A determines the input related to the avatar representing the user B based on the information regarding the behavior state of the user A.
  • the input related to the avatar representing the user B may include notification start, notification continuation, notification end, re-notification, and the like.
  • the transmission unit 113A transmits a signal (a signal for controlling the avatar A) to the reception side terminal 10B based on the input related to the avatar representing the user B.
  • the communication unit 140B receives the information regarding the behavior state of the user A (the signal for controlling the avatar A) via the network 90.
  • the transmission-side user behavior state acquisition unit 114B acquires information regarding the behavior state of the user A.
  • the avatar control unit 115B gradually changes the avatar A representing the user A according to the action state of the user A. The user B can easily grasp the situation of the user A by an avatar that changes according to the action state of the user A.
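  • The "gradual" change can be realized by stepping the avatar's presentation a little at each update rather than jumping to the target state. A minimal sketch follows; the single intensity value and step size are illustrative assumptions.

        class Avatar:
            """Stepwise avatar controller: one small step per update toward the
            presentation that corresponds to the current notification level."""

            def __init__(self) -> None:
                self.intensity = 0.0  # 0 = idle, 1 = maximally attention-drawing

            def step_toward(self, level: int, max_level: int = 4, step: float = 0.1) -> None:
                target = level / max_level
                if self.intensity < target:
                    self.intensity = min(target, self.intensity + step)
                else:
                    self.intensity = max(target, self.intensity - step)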
  • the sensor unit 120B obtains information on the behavior state of the user B by sensing.
  • the behavior state of the user B can include whether or not the user B has noticed the notification, whether or not the user B has responded to the notification, and the like.
  • the sensor unit 120B may be a sensor that acquires image information or depth information of the user B, for example.
  • the input determination unit 112B determines an input related to the avatar representing the user A based on the information related to the behavior state of the user B.
  • the input related to the avatar representing the user A may include whether or not the notification has been noticed, whether or not the notification has been responded to, and the like.
  • the transmission unit 113B transmits a signal (a signal for controlling the avatar B) to the transmission-side terminal 10A.
  • the transmission unit 113B may transmit, based on the first input, a first signal indicating the start of communication to the transmission side terminal 10A, and may transmit, based on the second input, a second signal indicating that communication is not started. Further, the transmission unit 113B may be regarded as transmitting a signal indicating the behavior state of the user B to the transmitting terminal 10A of the user A. As will be described later, the transmission unit 113B can change the signal indicating the behavior state of the user B based on the input related to the avatar A by the user B. Note that the signal transmitted by the transmission unit 113B may be regarded as a signal for controlling the avatar B controlled by the transmission side terminal 10A.
  • the signal transmitted by the transmission unit 113B may be a signal that directly controls the avatar B.
  • the signal transmitted by the transmission unit 113B may be converted into a signal for controlling the avatar B through processing by the network 90.
  • alternatively, the signal transmitted by the transmission unit 113B may be converted into a signal for controlling the avatar B through processing by the transmission side terminal 10A.
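  • To make the first/second signal distinction above concrete, a one-function sketch of the mapping follows; the signal names are hypothetical.

        def signal_for_input(determined_input: str) -> str:
            # First input -> first signal (communication starts);
            # second input -> second signal (communication is not started).
            if determined_input == "first_input":
                return "first_signal_start_communication"
            return "second_signal_communication_not_started"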
  • the communication unit 140A receives information on the behavior state of the user B (signal for controlling the avatar B) via the network 90.
  • the receiving-side user behavior state acquisition unit 114A acquires information regarding the behavior state of the user B.
  • the avatar control unit 115A gradually changes the avatar B representing the user B according to the action state of the user B. The user A can easily grasp the situation of the user B by an avatar that changes in accordance with the behavior state of the user B.
  • FIG. 6 is a diagram for explaining an example of avatar control.
  • the transmission side user A exists at the location X.
  • the sensor unit 120A can detect various interactions (for the avatar B) by the user A.
  • the receiving side user B exists in the location Y.
  • the sensor unit 120B can detect various interactions (for the avatar A) by the user B.
  • S101 indicates a state in which the user A talks to the avatar B that exists in the location X and represents the state of the user B.
  • the behavior (movement, action) of the user A is determined by the sensor unit 120A existing in the location X.
  • when the notification (call) from the user A to the user B is started, the notification start is acquired by the transmission-side user behavior state acquisition unit 114B, and the state of the avatar A, which represents the state of the user A and exists in the location Y, is changed by the avatar control unit 115B to a state indicating reception of the notification from the user A.
  • In S102, it is determined by the sensor unit 120B existing in the location Y that the user B has noticed the notification from the user A, in the same manner as in the location X. For example, whether or not the user B has noticed the notification may be determined based on whether or not the line of sight of the user B hits the avatar A.
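  • The line-of-sight check mentioned above can be sketched as a ray-sphere proximity test, assuming the eye position and gaze direction of the user B are available from image or depth sensing; the avatar radius is an illustrative assumption.

        import numpy as np

        def gaze_hits_avatar(eye_pos, gaze_dir, avatar_pos, avatar_radius: float = 0.3) -> bool:
            """True if the gaze ray passes within avatar_radius of the avatar center."""
            eye = np.asarray(eye_pos, dtype=float)
            d = np.asarray(gaze_dir, dtype=float)
            d = d / np.linalg.norm(d)
            to_avatar = np.asarray(avatar_pos, dtype=float) - eye
            t = float(np.dot(to_avatar, d))
            if t < 0.0:  # avatar is behind the user
                return False
            closest = eye + t * d  # closest point on the gaze ray to the avatar
            return float(np.linalg.norm(to_avatar - t * d)) <= avatar_radius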
  • the fact that the user B notices the contact of the user A is acquired by the receiving-side user behavior state acquisition unit 114A via the network 90, and is notified to the user A by the operation of the avatar B according to the control by the avatar control unit 115A.
  • the second input in the present disclosure may be considered to include “user B noticed notification from user A”.
  • the operation and position of the avatar A existing in the location Y change according to the notification level.
  • the avatar A moves into a range that strongly attracts the user B's attention and/or increases its amount of movement. More specifically, the avatar A may move between the user B and the object on which the user B is working, thereby interrupting the user B's work.
  • the fact that the user A continued the call is acquired by the transmission-side user behavior state acquisition unit 114B, and the avatar A operates in a stepwise manner according to the control by the avatar control unit 115B.
  • the avatar B changes to a state indicating user B's communication refusal (response refusal) according to the control by the avatar control unit 115A.
  • the avatar A may change so as to indicate that the user A has been discouraged, thereby indicating that the response rejection from the user B has been conveyed to the user A, according to the control by the avatar control unit 115B.
  • S105 shows a state in which the user A continues the notification to the user B (re-notifies) even though the user A has confirmed the response rejection from the user B.
  • when the continuation of the notification is received by the communication unit 140B and acquired by the transmission-side user behavior state acquisition unit 114B, the avatar A may change so as to indicate a higher degree of urgency according to the control by the avatar control unit 115B.
  • S106 shows a state in which user B has finally started communication with user A in response to a change in avatar A.
  • the communication may be started when the sensor unit 120B detects that the user B talks to the avatar A, similarly to the notification start in the location X.
  • the input determination unit 112B may determine that the user B has started communication based on image information or depth information indicating that the user B has performed a specific gesture.
  • the input determination unit 112B may determine that the user B has noticed the notification or that the user B rejects the response, based on image information or depth information of the user B indicating that the user B is not performing a specific gesture.
  • Such a specific gesture may be, for example, a gesture in which the hand of the user B approaches the face of the user B.
  • the proximity of the hand and the face can be determined based on whether or not the distance between the hand and the face is within a predetermined value.
  • the specific gesture may be a gesture in which the hand and the ear of the user B come close to each other, that is, a gesture like placing a handset against the ear.
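  • The proximity criterion described above (distance between hand and face, or hand and ear, within a predetermined value) can be sketched directly; the 15 cm threshold and the keypoint inputs are assumptions.

        import math

        def is_specific_gesture(hand_pos, face_pos, threshold_m: float = 0.15) -> bool:
            # Hand-to-face (or hand-to-ear) distance within the predetermined value.
            return math.dist(hand_pos, face_pos) <= threshold_m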
  • the input determination unit 112B may determine that the user B has noticed the notification based on the orientation or line-of-sight information of the user B obtained from the image information or depth information of the user B, and may determine the start of communication based on the voice information of the user B.
  • the user B can control the start of communication with the user A by a natural operation without performing a specific gesture.
  • the input determination unit 112B may be regarded as determining the input related to the avatar A based on an intentional input (first input) and a relatively unintentional input (second input) of the user B.
  • FIG. 7 is a flowchart showing an operation example of the transmitting terminal 10A.
  • an operation example of the transmitting terminal 10A will be described.
  • S201 is repeatedly executed.
  • when the notification start operation by the user A is detected by the sensor unit 120A and the notification by the communication unit 140A is started (“Yes” in S201), the operation shifts to S202.
  • when the receiving-side user B responds to the notification, the response is received by the communication unit 140A and acquired by the receiving-side user behavior state acquisition unit 114A (“Yes” in S202), and the communication unit 140A establishes a connection with the reception side terminal 10B. Thereby, the communication unit 140A starts communication between the user A and the user B (S203). At this time, the avatar control unit 115A controls the avatar B so as to indicate the start of communication.
  • when the receiving user B does not respond to the notification (“No” in S202), does not reject the response (“No” in S211), and the fact that the user B has noticed the notification has not been acquired by the receiving-side user behavior state acquisition unit 111B, the avatar control unit 115A controls the avatar B so as to indicate that the user B is not aware of the notification.
  • the transmission-side user behavior state acquisition unit 111A determines the notification level (S221).
  • the notification level is an example of information related to the behavior state of the user A, and can be estimated / acquired based on the call time or the stress level of the user A as described above.
  • when the notification level is not updated (“No” in S222), the operation shifts to S224.
  • when the notification level is updated (“Yes” in S222), the transmission unit 113A transmits the updated notification level to the receiving terminal 10B (S223), and the operation shifts to S224.
  • the notification level may be updated so as to increase as the call time becomes longer, or so as to increase as the stress level of the user A becomes higher.
  • conversely, the notification level may be updated so as to decrease. For example, when it is detected that the sending user A has started a specific action other than communication with the user B, or when a specific voice of the user A is detected, the notification level may be lowered.
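  • One update step for the sender-side notification level (S221 to S223) might look like the following sketch, reusing notification_level_from_call_time from the earlier sketch; the concrete increase and decrease amounts and the stress threshold are assumptions.

        def update_notification_level(level: int, elapsed_s: float, stress: float,
                                      started_other_action: bool = False) -> int:
            """Raise the level with call time or stress; lower it when user A
            is detected to have started a specific action other than this call."""
            new_level = max(level, notification_level_from_call_time(elapsed_s))
            if stress > 0.8:  # illustrative stress threshold
                new_level += 1
            if started_other_action:
                new_level = max(1, new_level - 1)
            return new_level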
  • the present notification level may be presented to the user A by the presenting unit 160A.
  • the current notification level may be presented to user A in any way.
  • the current notification level may be displayed on the display as a numerical value.
  • an animation with an expression corresponding to the current notification level (for example, an animation at the time of a call) may be displayed on the display. Further, the user A can set the notification level not to be updated.
  • when the response rejection is received, the avatar control unit 115A controls the avatar B so as to indicate the response rejection. As will be described later, the response rejection may be provided with a level indicating the strength of the rejection (hereinafter also referred to as the “rejection level”). In this case, the avatar control unit 115A may control the avatar B so as to be in a different state depending on the rejection level.
  • the transmission side user behavior state acquisition unit 111A acquires the acceptance by the user A for the response rejection (“Yes” in S212), the operation ends.
  • when the transmission-side user behavior state acquisition unit 111A does not acquire the acceptance by the user A of the response rejection (“No” in S212) and does not acquire the denial by the user A of the response rejection (“No” in S213), the operation ends.
  • when the denial by the user A of the response rejection is acquired (“Yes” in S213), the notification level is updated (S214), the transmission unit 113A transmits the updated notification level to the receiving terminal 10B (S215), and the operation shifts to S202.
  • the notification level is updated as described above.
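  • The FIG. 7 flow can be condensed into the following loop; every method on `terminal` is a hypothetical placeholder for the operations described above, keyed to the step numbers in comments.

        def sender_loop(terminal) -> None:
            """Simplified rendering of the transmitting terminal 10A flow (FIG. 7)."""
            if not terminal.notification_started():              # S201
                return
            while True:
                if terminal.received_response():                 # S202 "Yes"
                    terminal.start_communication()               # S203
                    return
                if terminal.received_response_rejection():       # S211 "Yes"
                    if terminal.user_a_accepts_rejection():      # S212 "Yes": give up
                        return
                    if not terminal.user_a_denies_rejection():   # S213 "No": end
                        return
                    terminal.update_notification_level()         # S214
                    terminal.send_notification_level()           # S215, back to S202
                    continue
                terminal.determine_notification_level()          # S221
                if terminal.level_updated():                     # S222 "Yes"
                    terminal.send_notification_level()           # S223
                terminal.wait_for_next_cycle()                   # S224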
  • FIG. 8 is a flowchart showing an operation example of the receiving terminal 10B. An operation example of the receiving terminal 10B will be described with reference to FIG. 8. As shown in FIG. 8, when a notification level is received by the communication unit 140B from the transmission side terminal 10A via the network 90, the notification level is acquired by the transmission-side user behavior state acquisition unit 114B, and the avatar control unit 115B controls the avatar A according to the notification level (changes the avatar A) (S302).
  • when no notification level is received, the operation returns to S301.
  • after the avatar A is changed, the operation proceeds to S304.
  • when the receiving-side user B responds to the notification (“Yes” in S304), the communication unit 140B establishes a connection with the transmitting terminal 10A. Accordingly, the communication unit 140B starts communication between the user A and the user B (S305). At this time, the avatar control unit 115B controls the avatar A so as to indicate the start of communication.
  • S306 is repeatedly executed.
  • when it is detected by the sensor unit 120B that the receiving side user B has finished the communication and this is acquired by the receiving-side user behavior state acquisition unit 111B (“Yes” in S306), the operation of the receiving side terminal 10B ends. If the receiving user B does not respond to the notification (“No” in S304), the operation proceeds to S311.
  • when the response rejection is not detected (“No” in S311), the operation proceeds to S321.
  • the notification end of the transmission side user is received by the communication unit 140B and acquired by the transmission side user behavior state acquisition unit 114B (“Yes” in S321), the operation ends. If the notification end of the transmission side user is not received by the communication unit 140B and is not acquired by the transmission side user behavior state acquisition unit 114B (“No” in S321), the operation proceeds to S301.
  • when the response rejection by the receiving user B is detected by the sensor unit 120B and acquired by the receiving-side user behavior state acquisition unit 111B (“Yes” in S311), the response rejection is transmitted to the transmitting terminal 10A, and the operation shifts to S312.
  • the response rejection may be detected in any way.
  • the response refusal may be detected by detecting that the user B has removed his / her line of sight from the avatar A, or detected by detecting that the user B has started a specific action other than communication with the user A.
  • it may be detected by detecting an explicit action that the user cannot respond.
  • the rejection level may be transmitted together with the response rejection.
  • the refusal level may be input by the user B in any way.
  • the refusal level may be input by an operation by the user B (for example, a button operation), or may be input by a voice uttered by the user B (for example, a specific voice such as “cannot respond”).
  • it may be input by a gesture by the user B (for example, a specific action that interrupts the notification).
  • when the communication unit 140B receives a denial by the sending user A of the response rejection (“Yes” in S313), the operation shifts to S301.
  • the communication unit 140B does not receive a denial by the sending user A for a response refusal by the receiving user B (“No” in S313), the operation ends.
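  • Similarly, the FIG. 8 flow condenses to the loop below; again every method on `terminal` is a hypothetical placeholder keyed to the step numbers described above.

        def receiver_loop(terminal) -> None:
            """Simplified rendering of the receiving terminal 10B flow (FIG. 8)."""
            while True:
                if terminal.received_notification_level():       # S301 "Yes"
                    terminal.change_avatar_a()                   # S302
                if terminal.user_b_responded():                  # S304 "Yes"
                    terminal.start_communication()               # S305
                    while not terminal.user_b_finished():        # S306
                        pass                                     # communication in progress
                    return
                if terminal.user_b_rejected_response():          # S311 "Yes"
                    terminal.send_response_rejection()           # S312 (may carry a rejection level)
                    if not terminal.received_denial():           # S313 "No": sender gave up
                        return
                    continue                                     # S313 "Yes": back to S301
                if terminal.sender_ended_notification():         # S321 "Yes"
                    return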
  • FIG. 9 is a block diagram illustrating a hardware configuration example of the information processing apparatus 10 according to the embodiment of the present disclosure.
  • the information processing apparatus 10 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
  • the information processing apparatus 10 includes a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925. Further, the information processing apparatus 10 includes an imaging device 933 and a sensor 935.
  • the information processing apparatus 10 may include a processing circuit called a DSP (Digital Signal Processor) or ASIC (Application Specific Integrated Circuit) instead of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 10 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
  • the ROM 903 stores programs and calculation parameters used by the CPU 901.
  • the RAM 905 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
  • the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • the input device 915 may include a microphone that detects the user's voice.
  • the input device 915 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 10.
  • the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data to the information processing device 10 or instruct a processing operation.
  • the output device 917 is a device that can notify the user of the acquired information visually or audibly.
  • the output device 917 is, for example, a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Electro-Luminescence) display, a projector, or a hologram display device; a sound output device such as a speaker or headphones; or a printer device.
  • the output device 917 outputs the result obtained by the processing of the information processing device 10 as a video such as text or an image, or as a sound such as voice or sound.
  • the output device 917 may include a light such as an LED (light-emitting diode).
  • the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 10.
  • the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 10.
  • the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
  • the drive 921 writes a record in the attached removable recording medium 927.
  • the connection port 923 is a port for directly connecting a device to the information processing apparatus 10.
  • the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
  • the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
  • Various data can be exchanged between the information processing apparatus 10 and the external connection device 929 by connecting the external connection device 929 to the connection port 923.
  • the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
  • the communication device 925 can be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
  • the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
  • the communication network 931 connected to the communication device 925 is a wired or wireless network, such as the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
  • the imaging device 933 is an apparatus that images a real space and generates a captured image, using various members such as an imaging element, for example a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor), and a lens for controlling the formation of a subject image on the imaging element.
  • the imaging device 933 may capture a still image or may capture a moving image.
  • the sensor 935 is various sensors such as a distance measuring sensor, an acceleration sensor, a gyro sensor, a geomagnetic sensor, a vibration sensor, an optical sensor, and a sound sensor.
  • the sensor 935 acquires information about the state of the information processing apparatus 10 itself, such as the attitude of the housing of the information processing apparatus 10, and information about the surrounding environment of the information processing apparatus 10, such as brightness and noise around the information processing apparatus 10.
  • the sensor 935 may include a GPS sensor that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the apparatus.
  • As described above, according to the embodiment of the present disclosure, there is provided an information processing apparatus including: a first behavior state acquisition unit that acquires first behavior state information related to the behavior state of a first user present at a first location; a second behavior state acquisition unit that acquires second behavior state information related to the behavior state of a second user present at a second location; an avatar control unit that gradually changes, according to the behavior state of the first user, a first avatar that represents the first user and is provided so as to be visible to the second user at the second location; an input determination unit that determines an input related to the first avatar based on the second behavior state information; and a transmission unit that transmits a signal to a terminal of the first user present at the first location based on the input related to the first avatar.
  • the user on the notification receiving side or the user on the notification transmitting side can easily grasp the situation of the other party.
  • it is also possible to create a program for causing hardware such as a CPU, ROM, and RAM incorporated in a computer to exhibit functions equivalent to the functions of the control unit 110A.
  • a computer-readable recording medium that records the program can be provided.
  • it is possible to create a program for causing hardware such as a CPU, ROM, and RAM incorporated in a computer to exhibit functions equivalent to the functions of the control unit 110B.
  • a computer-readable recording medium that records the program can be provided.
  • In the above, the case where the transmission-side user behavior state acquisition unit 111A, the input determination unit 112A, the transmission unit 113A, the reception-side user behavior state acquisition unit 114A, and the avatar control unit 115A are incorporated in the transmission side terminal 10A has mainly been described. However, some of these functions may be incorporated in a device different from the transmission side terminal 10A.
  • the input determination unit 112A may be incorporated in a device (for example, a server) different from the transmission side terminal 10A.
  • Similarly, the case where the reception-side user behavior state acquisition unit 111B, the input determination unit 112B, the transmission unit 113B, the transmission-side user behavior state acquisition unit 114B, and the avatar control unit 115B are incorporated in the reception side terminal 10B has mainly been described.
  • some of these functions may be incorporated in a device different from the receiving terminal 10B.
  • the input determination unit 112B may be incorporated in a device (for example, a server) different from the receiving terminal 10B.
  • In the above, the case where the avatar is provided so as to be visible to the user has mainly been described.
  • the presence of the avatar may be presented to the user without using visually recognizable information by using an output device capable of performing so-called sound image localization. That is, the avatar may be regarded as an agent localized at any position in the space, and the method of providing it to the user is not limited to display control.
  • as an output device that performs such sound image localization, an open speaker that localizes the sound image of the avatar in space based on the head-related transfer function (HRTF) may be used.
  • (1) An information processing apparatus including:
  • a first behavior state acquisition unit that acquires first behavior state information related to the behavior state of a first user existing in a first location;
  • a second behavior state acquisition unit that acquires second behavior state information related to the behavior state of a second user existing in a second location;
  • an avatar control unit that gradually changes a first avatar representing the first user, provided so as to be visible to the second user in the second location, according to the behavior state of the first user;
  • an input determination unit that determines an input related to the first avatar based on the second behavior state information; and
  • a transmission unit that transmits a signal to a terminal of the first user existing in the first location based on the input related to the first avatar.
  • (2) The information processing apparatus according to (1), wherein the input related to the first avatar includes at least one of a first input and a second input, and the transmission unit transmits a first signal indicating the start of communication to the terminal of the first user based on the first input, and transmits a second signal indicating that communication is not permitted to the terminal of the first user based on the second input.
  • (3) The information processing apparatus according to (2), wherein the avatar control unit changes the state of the first avatar to a state indicating re-notification from the first user.
  • (4) The information processing apparatus in which the first input is an input by the second user that is relatively intentional compared with the second input.
  • (5) The information processing apparatus according to (4), including an input determination unit that determines the first input and the second input based on image information or depth information of the second user, wherein the first input includes information regarding a specific gesture, and the second input does not include information regarding the specific gesture.
  • (6) The information processing apparatus according to (5), wherein the specific gesture is a gesture in which the hand of the second user approaches the face of the second user.
  • (7) The information processing apparatus according to (4), including an input determination unit that determines the first input based on the voice information of the second user and further determines the second input based on the image information or depth information of the second user.
  • (8) The information processing apparatus according to any one of (4) to (7), wherein the input determination unit determines, as the second input, that the second user has recognized a change in the first avatar based on the image information or depth information of the second user.
  • (9) The information processing apparatus according to any one of (1) to (8), wherein the avatar control unit starts changing the display of the first avatar in response to a permission request regarding the start of communication with the second user, transmitted from the terminal of the first user.
  • (10) … the information processing apparatus.
  • (11) The information processing apparatus according to any one of (1) to (10), wherein the transmission unit transmits a signal indicating the behavior state of the second user to the terminal of the first user.
  • (12) The information processing apparatus according to (11), wherein the transmission unit changes the signal indicating the behavior state of the second user based on the input related to the first avatar.
  • (13) The information processing apparatus according to (12), wherein the transmission unit transmits a signal for controlling an avatar representing the second user controlled by the terminal of the first user.
  • (14) The information processing apparatus according to any one of (1) to (13), further including a display device that displays the first avatar.
  • the first avatar is a moving body having a drive mechanism.
  • (16) Obtaining first behavior state information relating to the behavior state of the first user present at the first location; Obtaining second behavior state information relating to the behavior state of the second user present at the second location; Gradually changing a first avatar representing a first user provided to be visible to the second user at the second location in accordance with the behavior state of the first user; Determining an input related to the first avatar based on the second behavior state information;
  • a processor sends a signal to a terminal of a first user at the first location based on an input related to the first avatar; Including an information processing method.
  • a first behavior state acquisition unit that acquires first behavior state information related to the behavior state of the first user existing in the first location;
  • a second behavior state acquisition unit that acquires second behavior state information related to the behavior state of the second user existing in the second location;
  • An avatar control unit that gradually changes a first avatar representing a first user provided to be visible to the second user at the second location according to the behavior state of the first user.
  • An input determination unit that determines an input related to the first avatar based on the second behavior state information;
  • a transmission unit that transmits a signal to a terminal of a first user existing in the first location based on an input related to the first avatar;
  • a program for causing an information processing apparatus to function.
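Read together, items (1), (16), and (17) describe one pipeline: two behavior-state sources feed an avatar controller and an input classifier, and a transmitter signals the first user's terminal. The following Python sketch is only an illustration of that composition under stated assumptions; every class, method, and signal name below is invented here for the example, not taken from the publication.

```python
from dataclasses import dataclass
from enum import Enum, auto


class BehaviorState(Enum):
    """Hypothetical coarse behavior states; the items leave these open."""
    IDLE = auto()
    FOCUSED = auto()
    AWAY = auto()


class Terminal:
    """Stand-in for the first user's terminal; it simply records signals."""

    def __init__(self) -> None:
        self.signals: list[str] = []

    def send(self, signal: str) -> None:
        self.signals.append(signal)


@dataclass
class AvatarDisplay:
    """Display state of the first avatar, changed in stages (item (1))."""
    stage: int = 0          # 0 = neutral; higher stages draw more attention
    renotify: bool = False  # re-notification state of item (3)


class InformationProcessingApparatus:
    """Sketch of items (1)-(3); every name here is illustrative only."""

    MAX_STAGE = 3

    def __init__(self, first_user_terminal: Terminal) -> None:
        self.terminal = first_user_terminal
        self.avatar = AvatarDisplay()

    def on_first_behavior_state(self, state: BehaviorState) -> None:
        # Avatar control unit: advance the avatar one stage at a time rather
        # than jumping straight to a full notification ("gradually changes").
        if state is BehaviorState.FOCUSED:
            self.avatar.stage = min(self.avatar.stage + 1, self.MAX_STAGE)

    def on_second_behavior_input(self, first_input: bool) -> None:
        # Input determination + transmission units: a deliberate "first input"
        # starts communication (item (2)); a passive "second input" declines
        # it and flags the avatar for re-notification (item (3)).
        if first_input:
            self.terminal.send("first_signal: start of communication")
        else:
            self.terminal.send("second_signal: communication not permitted")
            self.avatar.renotify = True


# Example: the second user merely notices the avatar, so communication is
# declined and the avatar switches to its re-notification state.
apparatus = InformationProcessingApparatus(Terminal())
apparatus.on_first_behavior_state(BehaviorState.FOCUSED)
apparatus.on_second_behavior_input(first_input=False)
assert apparatus.terminal.signals == ["second_signal: communication not permitted"]
assert apparatus.avatar.renotify
```

Keeping the transmitter behind a small Terminal interface mirrors the separation the items draw between determining the input and signalling the first user's terminal.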

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Telephone Function (AREA)

Abstract

The problem addressed by the present invention is to provide technology by which a user on the notification-receiving side or a user on the notification-transmitting side can easily understand the counterpart's circumstances. The solution according to the invention is an information processing device comprising: a first behavior state acquisition unit that acquires first behavior state information relating to a behavior state of a first user present at a first location; a second behavior state acquisition unit that acquires second behavior state information relating to a behavior state of a second user present at a second location; an avatar control unit that gradually changes, in accordance with the behavior state of the first user, a first avatar representing the first user, provided so as to be visually recognizable by the second user at the second location; an input determination unit that determines an input relating to the first avatar on the basis of the second behavior state information; and a transmission unit that transmits, to a terminal of the first user present at the first location, a signal based on the input relating to the first avatar.
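Items (5) to (8) of the enumeration above hinge on telling a deliberate gesture, such as the second user's hand approaching their face, apart from mere recognition of the avatar's change. Below is a minimal sketch of one way to make that distinction, assuming normalized 2-D landmarks extracted from image or depth data; the coordinates, threshold, and function name are illustrative assumptions, not values from the publication.

```python
from math import dist

# Hypothetical normalized 2-D face landmark from image or depth data.
FACE_CENTER = (0.50, 0.30)


def classify_second_user_input(hand_xy: tuple[float, float],
                               gaze_on_avatar: bool,
                               near: float = 0.15) -> str:
    """Map the second user's observed behavior onto the two claimed inputs.

    A hand close to the face counts as the deliberate "first input"
    (items (5)-(6)); merely having recognized the avatar's change counts
    as the "second input" (item (8)). The 0.15 threshold is an arbitrary
    example value.
    """
    if dist(hand_xy, FACE_CENTER) < near:
        return "first_input"   # specific gesture -> start communication
    if gaze_on_avatar:
        return "second_input"  # change recognized -> communication declined
    return "no_input"


# A hand raised near the face is classified as the deliberate first input.
assert classify_second_user_input((0.55, 0.35), gaze_on_avatar=False) == "first_input"
# A glance at the avatar without the gesture is only the second input.
assert classify_second_user_input((0.10, 0.80), gaze_on_avatar=True) == "second_input"
```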
PCT/JP2019/002858 2018-03-30 2019-01-29 Information processing device and method, and program WO2019187593A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/040,194 US20210014457A1 (en) 2018-03-30 2019-01-29 Information processing device, information processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018069787A JP2021099538A (ja) 2018-03-30 Information processing device, information processing method, and program
JP2018-069787 2018-03-30

Publications (1)

Publication Number Publication Date
WO2019187593A1 (fr)

Family

ID=68061299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/002858 WO2019187593A1 (fr) 2018-03-30 2019-01-29 Information processing device and method, and program

Country Status (3)

Country Link
US (1) US20210014457A1 (fr)
JP (1) JP2021099538A (fr)
WO (1) WO2019187593A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023095531A1 (fr) * 2021-11-25 2023-06-01 Sony Group Corporation Information processing device, method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000066807A (ja) * 1998-08-18 2000-03-03 Nippon Telegr & Teleph Corp <Ntt> Feeling input device, feeling output device, and feeling communication system
JP2002152386A (ja) * 2000-11-09 2002-05-24 Sony Corp Communication system, communication method, and communication terminal
WO2011004652A1 (fr) * 2009-07-09 2011-01-13 NEC Corporation Event indication device, event indication method, program, and recording medium
JP2014059894A (ja) * 2008-05-27 2014-04-03 Qualcomm Incorporated Method and system for automatically updating avatar status to indicate user status


Also Published As

Publication number Publication date
JP2021099538A (ja) 2021-07-01
US20210014457A1 (en) 2021-01-14

Similar Documents

Publication Publication Date Title
US11153431B2 (en) Mobile terminal and method of operating the same
US10613330B2 (en) Information processing device, notification state control method, and program
EP3097458B1 (fr) Directing audio output from gestures
US10359839B2 (en) Performing output control based on user behaviour
WO2017130486A1 (fr) Information processing device, information processing method, and program
WO2020057258A1 (fr) Information processing method and terminal
JP6627775B2 (ja) Information processing device, information processing method, and program
WO2018139036A1 (fr) Information processing device, information processing method, and program
WO2019187593A1 (fr) Information processing device and method, and program
WO2016157993A1 (fr) Information processing device, information processing method, and program
JP2016109726A (ja) Information processing device, information processing method, and program
WO2015198729A1 (fr) Display control device, display control method, and program
US20210243360A1 (en) Information processing device and information processing method
WO2015125364A1 (fr) Electronic apparatus and method for providing images
JP7468506B2 (ja) Information processing device, information processing method, and recording medium
WO2017149848A1 (fr) Information processing device, information processing method, and program
US11372473B2 (en) Information processing apparatus and information processing method
US11935449B2 (en) Information processing apparatus and information processing method
WO2020031795A1 (fr) Information processing device, information processing method, and program
WO2018139050A1 (fr) Information processing device, information processing method, and program
JP7078036B2 (ja) Information processing device, information processing method, and program
KR101497181B1 (ko) Method for controlling call mode, mobile terminal, and recording medium
US11270386B2 (en) Information processing device, information processing method, and program
JP7074343B2 (ja) Information processing device
WO2016147693A1 (fr) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19777974

Country of ref document: EP

Kind code of ref document: A1

122 EP: PCT application non-entry in European phase

Ref document number: 19777974

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP