WO2020202353A1 - Communication robot, control method therefor, information processing server, and information processing method - Google Patents

Communication robot, control method therefor, information processing server, and information processing method

Info

Publication number
WO2020202353A1
WO2020202353A1 (PCT/JP2019/014262)
Authority
WO
WIPO (PCT)
Prior art keywords
user
communication
state
robot
call
Prior art date
Application number
PCT/JP2019/014262
Other languages
English (en)
Japanese (ja)
Inventor
直秀 小川
薫 鳥羽
渉 瀬下
山川 浩
Original Assignee
本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority to PCT/JP2019/014262 (WO2020202353A1)
Priority to JP2021511727A (JP7212766B2)
Publication of WO2020202353A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04M — TELEPHONIC COMMUNICATION
    • H04M 1/00 — Substation equipment, e.g. for use by subscribers
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04M — TELEPHONIC COMMUNICATION
    • H04M 3/00 — Automatic or semi-automatic exchanges
    • H04M 3/42 — Systems providing special services or facilities to subscribers
    • H04M 3/56 — Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities

Definitions

  • The present invention relates to a communication robot, a control method therefor, an information processing server, and an information processing method.
  • Patent Document 1 is premised on communication between the user and the robot, and does not consider the robot intervening with the user so that the user and a remote user can communicate with each other. Further, when a user and a remote user communicate via a robot, a user may feel uncomfortable if his or her living space is transmitted as it is. That is, for both users to communicate smoothly, the handling of the transmitted information becomes an issue.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a technique capable of adjusting the information transmitted when a user and a remote user communicate with each other.
  • According to one aspect of the present invention, there is provided a communication robot comprising: identification means for identifying the state of a first user; receiving means for receiving, from an external device, state information indicating the state of a second user who communicates remotely with the first user; determination means for determining whether communication between the first user and the second user is possible based on the state of the first user and the state of the second user; and call control means for providing a call between the communication robot and a device used by the second user.
  • The call control means makes the call using an image in which at least a part of an image captured of the first user or the second user has been changed, or a voice in which at least a part of a voice picked up from the first user or the second user has been changed.
  • FIG. 1 shows an example of the communication system according to Embodiment 1 of the present invention.
  • Block diagram showing a functional configuration example of the communication robot according to the first embodiment.
  • Block diagram showing an example of the software configuration of the communication robot according to the first embodiment.
  • Flowchart showing a series of operations of the communication control process according to the first embodiment.
  • Flowchart showing a series of operations of the history processing according to the first embodiment.
  • Diagram showing an example of the selection screen for selecting the user states in which the communication robot intervenes, according to the first embodiment.
  • Block diagram showing a functional configuration example of the information processing server according to the second embodiment.
  • Block diagram showing an example of the software configuration of the communication robot according to the second embodiment.
  • The present embodiment is not limited to this, and can also be applied to communication between other persons, for example when the target user and the remote user are friends.
  • The place where the robot and the child communicate with each other is described assuming a predetermined space (for example, a space in the home, such as a children's room or a shared space).
  • The place where the remote user stays is described as a predetermined remote space (for example, a shared space in the home of the grandparents, or another space where the remote user stays).
  • FIG. 1 shows an example of the communication system 10 according to the present embodiment.
  • The robot 100 and the target user 110 exist in a predetermined space, and the robot 100 interacts with the target user 110.
  • The robot 100 can communicate with the device 150 (for example, a mobile device) used by the remote user 160 via the network 120. That is, the target user 110 and the remote user 160 can communicate with each other via the robot 100 and the device 150.
  • The device 150 used by the remote user 160 is, for example, a smartphone, but is not limited to this, and may be a personal computer, a television, a tablet terminal, a smart watch, a game machine, or the like.
  • A movable communication robot may exist in place of the device 150, in which case the target user 110 and the remote user 160 may communicate with each other via the two communication robots.
  • FIG. 2 is a block diagram showing a functional configuration example of the robot 100 according to the present embodiment.
  • The control unit 210 included in the robot 100 includes one or more CPUs (Central Processing Units) 211, an HDD (Hard Disk Drive) 212, and a RAM (Random Access Memory) 213.
  • The CPU 211 controls the various processes described below by reading and executing programs stored in the HDD 212.
  • The HDD 212 is a non-volatile storage area, and stores the programs corresponding to the various processes.
  • A semiconductor memory may be used instead of the HDD.
  • The RAM 213 is a volatile storage area and is used, for example, as a work memory.
  • The control unit 210 may also be composed of a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), a dedicated circuit, or the like.
  • The robot 100 includes various units that serve as interfaces for exchanging information with the outside. Each unit described below operates based on control by the control unit 210.
  • The voice input unit 214 is a part for acquiring voice information from the outside, and includes, for example, a microphone. In the present embodiment, the voice input unit 214 acquires, for example, the utterances of the target user in order to make a call.
  • The image pickup unit 215 is a part for acquiring image information of the user and the surroundings, and includes, for example, a camera. In the present embodiment, it acquires, for example, an image of the target user in order to make a video call.
  • The power supply unit 216 is a part that supplies power to the robot 100, and corresponds to a battery.
  • The audio output unit 217 is a part for outputting audio information to the outside, and includes, for example, a speaker. In the present embodiment, the audio output unit 217 outputs, for example, the utterances of the remote user.
  • The image display unit 218 is a part for outputting image information, and includes a display and the like. In the present embodiment, the image display unit 218 displays, for example, an image of the remote user in order to make a video call.
  • The operation unit 219 is a part for receiving user operations on the robot.
  • The image display unit 218 and the operation unit 219 may be configured together as, for example, a touch panel display that also functions as an input device.
  • The notification unit 220 is a part for notifying the outside of various information; it may be, for example, a light emitting unit such as an LED (Light Emitting Diode), or may be integrated with the audio output unit 217.
  • The drive unit 221 is a part for moving the robot 100, and may include, for example, an actuator, tires, a motor, an engine, and the like.
  • the sensor unit 222 includes various sensors for acquiring information on the external environment.
  • The sensors may include, for example, a temperature sensor, a humidity sensor, an infrared sensor, a depth sensor, a LiDAR (Light Detection and Ranging) sensor, and the like; sensors may be provided according to the information to be acquired.
  • the image pickup unit 215 may be included in the sensor unit 222.
  • The communication unit 223 is a part for communicating with an external device (for example, the device 150 used by the remote user, or an external server) via the network 120; the communication method, communication protocol, and the like are not particularly limited. Further, the communication unit 223 may include a GPS (Global Positioning System) receiver for detecting the robot's own position information.
  • Although each unit is shown as a single block, each unit may be physically divided and installed at a plurality of places according to the function and structure of the robot 100.
  • The power supply unit 216 and the like may be configured to be detachable from the robot 100, or an interface for adding parts that provide new functions may be provided.
  • The appearance of the robot 100 is not particularly limited, and may be configured according to, for example, the person (for example, a child) assumed as the communication target. The material and size of the robot are also not particularly limited.
  • FIG. 3 is a diagram showing an example of a software configuration of the robot 100 according to the present embodiment.
  • Each processing unit is realized by the CPU 211 reading and executing a program stored in the HDD 212 or the like.
  • Each DB referred to below is a database.
  • The software configuration shows an exemplary configuration of the present embodiment; software such as firmware, the OS, middleware, and frameworks is omitted.
  • The user state identification unit 301 is a part that identifies the user state of the target user based on information from the sensor unit 222, the image pickup unit 215, and the voice input unit 214, and on information received from the outside.
  • The user states are described later with reference to FIG. 4. Note that not all user states need to be identified by the user state identification unit 301; some user states may be set externally via the communication unit 223, or set on the robot 100 side via the operation unit 219.
  • The user state DB 313 stores a user state table. The user state table, described in detail with reference to FIG. 4, defines the correspondence between each identified user state and the communication content between the target user and the remote user for that state.
  • The user state processing unit 302 specifies the communication content to be performed for the identified user state, using the user state identified by the user state identification unit 301 and the user state table stored in the user state DB 313.
  • The user state identification unit 301 periodically identifies the user state of the target user and stores it, together with the place and time of identification, in the action history DB 312 as the action history of the target user.
  • When the user state of the target user is identified periodically and none of the user states shown in FIG. 4 can be identified, the state is stored as "unidentified" or the like.
  • The map management unit 303 creates a map of the space in which the robot 100 operates and updates it periodically.
  • The map may be created based on information acquired by the sensor unit 222 or the imaging unit 215 included in the robot 100, or may be created based on location information acquired from the outside via the communication unit 223.
  • The map management unit 303 stores the created or updated map in the map information DB 311.
  • The map management unit 303 provides map information according to the identified user state.
  • The communication processing unit 304 exchanges information with the outside. For example, the communication processing unit 304 receives state information indicating the state of the remote user, and transmits and receives the data of a voice call or video call through which the target user and the remote user communicate. It may also transmit state information indicating the user state identified for the target user to the external device.
  • The remote user state acquisition unit 305 acquires state information indicating the user state of the remote user via the network 120. The user state of the remote user may be acquired periodically, and the periodically acquired user state may be stored in the action history DB 312 as the action history of the remote user.
  • The communication control unit 306 controls the robot to intervene with the target user, based on the user state of the target user identified by the user state identification unit 301 and the user state of the remote user acquired by the remote user state acquisition unit 305, and to start communication (a voice call or a video call) between the target user and the remote user.
  • The process performed by the communication control unit 306 is described later as the communication control process.
  • The action history processing unit 307 identifies patterns in the user state of the target user based on the periodic user states of the target user stored in the action history DB 312 (that is, it functions as a pattern identification means). Further, when the user states of the remote user are stored in the action history DB 312, patterns in the user state of the remote user may likewise be identified from the periodically stored states.
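The pattern identification by the action history processing unit 307 could, as a minimal sketch, be a frequency count over the periodically stored states. The record format and function name below are assumptions for illustration; the patent does not specify the schema of the action history DB 312.

```python
from collections import Counter, defaultdict

def identify_state_patterns(history):
    """Return the most frequent user state observed for each hour of day.

    `history` is assumed to be an iterable of (hour, state) records
    drawn from the periodically stored action history.
    """
    by_hour = defaultdict(Counter)
    for hour, state in history:
        by_hour[hour][state] += 1
    return {h: counts.most_common(1)[0][0] for h, counts in by_hour.items()}
```

A pattern extracted this way (for example, "usually in bed around 20:00") could then inform when the robot expects a given state to occur.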
  • The movement control unit 308 controls the robot to move to the place where communication is performed, according to the identified user state and the like.
  • As described above, the robot of the present embodiment intervenes with the target user when the target user is in a predetermined user state, and communicates with the remote user's device so that the target user and the remote user can communicate with each other.
  • An example of the user states that cause the robot to intervene with the target user will now be described.
  • FIG. 4 shows a user state table 400 showing the user states according to the present embodiment and the communication content associated with each user state.
  • The user state table 400 is composed of a user state 401, an occurrence timing 402, a communication content 403, and a communication location 404. The user state 401 is a name identifying the user state; the occurrence timing 402 indicates the timing at which the robot 100 identifies the corresponding user state; the communication content 403 shows an example of the content of the communication performed in the corresponding user state; and the communication location 404 indicates the place where the corresponding communication is performed.
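The user state table 400 can be modeled as a simple lookup structure. The entries below paraphrase the examples given for FIG. 4, and the class and field names are illustrative assumptions rather than the patent's actual data format.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UserStateEntry:
    state: str              # user state 401
    occurrence_timing: str  # occurrence timing 402
    communication: str      # communication content 403
    location: str           # communication location 404

# Two representative rows paraphrased from the description of FIG. 4.
USER_STATE_TABLE = {e.state: e for e in [
    UserStateEntry("strong emotions are expressed",
                   "crying or yelling detected from voice information",
                   "grandparents talk to the child",
                   "current position of the child"),
    UserStateEntry("immediately before going out",
                   "predetermined going-out actions detected",
                   "send-off greeting by the grandparents",
                   "entrance"),
]}

def communication_for(state):
    """Look up the communication content and location for an identified state."""
    entry = USER_STATE_TABLE.get(state)
    return (entry.communication, entry.location) if entry else None
```

This mirrors how the user state processing unit 302 could resolve an identified state to the content and place of the communication.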
  • The user state "strong emotions are expressed" is a state in which the target user (a child) is crying or yelling.
  • The robot 100 identifies this user state based on, for example, voice information.
  • The communication content associated with this user state is the grandparents talking to the (crying or yelling) child. If the grandparents, who are remote users, can intervene with the target user via the robot and talk to the child, it creates an opportunity to calm the target user down and have an honest conversation. The robot performs this communication at the current position of the target user (the child).
  • The user state "learning/practice start" is, for example, a state in which the child has started reading aloud, practicing a musical instrument, or practicing dancing.
  • The robot 100 identifies this state when it is input, for example through the operation unit 219 of the robot 100 or the terminal of the child's parents, that learning or practice has started.
  • Alternatively, this state may be identified by the robot 100 recognizing the child's behavior based on voice information or image information.
  • The communication content for this user state is, for example, the grandparents observing the movements and utterances of the child.
  • If the grandparents, who are remote users, can intervene with the target user via the robot to listen to the reading aloud and watch the practice, they can support the child's practice on behalf of, or in addition to, the parents.
  • The robot 100 performs this communication, for example, at the current position of the target user (the child).
  • The user state "immediately before going to bed" is a state in which the target user enters the bed after taking predetermined actions before going to bed.
  • The robot 100 identifies this user state based on, for example, voice information or image information.
  • The communication content for this user state is, for example, an utterance by the grandparents (reading aloud). If the grandparents, who are remote users, can intervene with the target user via the robot and read aloud, the opportunities for reading aloud can be increased and the parents' labor can be reduced.
  • The robot 100 performs this communication in the shared space or the children's room.
  • The user state "immediately before going out" is a state in which the target user takes predetermined actions before going out and then heads to the entrance.
  • When the robot 100 identifies, based on voice information or image information, that the child has taken predetermined going-out actions at a predetermined time, such as brushing teeth or preparing predetermined belongings, it identifies this user state.
  • The communication content for this user state is, for example, a send-off greeting by the grandparents. If the grandparents, who are remote users, can intervene with the target user via the robot and give a send-off greeting, they can send the child off on behalf of, or in addition to, the parents, for example with a word of encouragement.
  • The robot 100 performs this communication at the entrance.
  • The user state "immediately before returning home" is a state in which the target user has returned to the vicinity of the home.
  • The robot 100 receives, at a predetermined frequency via the communication unit 223, GPS signals from a device carried by the target user to acquire the target user's movement trajectory, and identifies this user state when the trajectory comes within a predetermined distance of the home.
  • The communication content for this user state is, for example, a welcome-home greeting by the grandparents.
  • The robot 100 moves to the entrance and waits in order to intervene with the target user. If the grandparents, who are remote users, can intervene with the target user via the robot and greet the child, they can welcome the child in place of the parents even in the parents' absence, reducing the child's loneliness.
  • The robot 100 performs this communication at the entrance.
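The "immediately before returning home" detection above amounts to a geofence test on the GPS trajectory. The sketch below assumes hypothetical home coordinates and a hypothetical threshold; the patent only specifies "a range of a predetermined distance from the home".

```python
import math

HOME = (35.6895, 139.6917)  # hypothetical home coordinates (lat, lon)
GEOFENCE_M = 100.0          # hypothetical "predetermined distance" in meters

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * math.asin(math.sqrt(a))

def immediately_before_returning_home(trajectory):
    """Identify the state when the latest GPS fix enters the home geofence."""
    return bool(trajectory) and haversine_m(trajectory[-1], HOME) <= GEOFENCE_M
```

In practice the robot would feed each periodically received GPS fix into such a check and move to the entrance on the first positive result.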
  • The user states may also include, for example, "watching TV" and "expressing intention to communicate".
  • The user state "watching TV" is a state in which the target user is watching TV. This state is relatively relaxed, and may allow the grandparents, who are remote users, to join in and have a conversation. Further, when an intention to communicate (at a predetermined time or the like) is input via the operation unit 219 of the robot 100 or the terminal of the child's parents, the robot 100 identifies the "expressing intention to communicate" state.
  • The communication content in these states is, for example, the target user and the remote user having an ordinary conversation.
  • The intention to communicate can be set, for example, from the operation screen shown in FIG. 7, displayed on the terminal of the child's parents.
  • By pressing the radio button 701, it can be set that communication is possible at the present time, or that communication is possible "from 20:00 to 21:00".
  • The information input at the parents' terminal is received by the communication unit 223 of the robot 100 and then used for determining the user state.
  • The set time information may also be transmitted to the device of the remote user.
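The availability set via the radio button 701 (for example, "from 20:00 to 21:00") reduces to a time-window check when determining the user state. A minimal sketch, assuming the windows are stored as (start, end) pairs within one day:

```python
from datetime import time

def communicable_now(now, windows):
    """Return True if `now` falls inside any configured availability window.

    `windows` is a list of (start, end) `datetime.time` pairs, e.g. the
    "from 20:00 to 21:00" setting from the FIG. 7 screen; an empty list
    means no time-based availability has been set.
    """
    return any(start <= now <= end for start, end in windows)
```

The same check could apply on the remote user's side, whose state in this embodiment is simply whether communication is possible, possibly within a time zone.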
  • The user states described above are states in which communication is possible; however, the user states may also include states in which communication is impossible, such as "eating a meal", "bathing", and "sleeping".
  • The user states identified by the robot may differ depending on the attributes of the target user. For example, depending on the child's age, the robot may be set to identify only the user states "immediately before going out", "immediately before returning home", and "learning/practice start".
  • The user states to be identified may be selected, for example, via the operation unit 219 of the robot 100 or the terminal of the child's parents.
  • The user states to be processed by the robot 100 may be set by adding a check mark 801 to each of the listed user states 802.
  • The robot 100 acquires the user state of the remote user and takes it into consideration.
  • In the present embodiment, the user state of the remote user is simply a state in which communication is possible, or possible in a predetermined time zone.
  • Such a user state is set via an operation screen, similar to the one shown in FIG. 7, on the device 150 used by the remote user.
  • Alternatively, a robot similar to the robot 100 may exist on the remote user side, and the user state of the remote user may be identified by that robot. In this case, for example, when the robot identifies that the remote user is watching TV, the remote user is considered able to communicate.
  • When the robot 100 according to the present embodiment considers the user state of the target user and the user state of the remote user and determines that communication is possible, the robot 100 provides the target user with information proposing communication with the remote user.
  • The information proposing communication may be provided by presenting audio or images. The audio or images may include the name of the other party and information representing the content of the communication. If the user state is "immediately before going out", the information representing the communication content includes, for example, "There is a greeting from your grandmother." If the user state is "strong emotions are expressed", the content differs according to the user state, for example "Your grandmother will talk to you."
  • When the robot 100 intervenes with the target user in order to propose communication, the robot 100 may approach the target user, circle around the target user, or give notification of a predetermined image or voice via the notification unit 220, according to the user state.
  • The map management unit 303 of the robot 100 creates a map of the operating space using the peripheral information acquired by the image pickup unit 215, the sensor unit 222, and the like. For example, the map may be updated by re-collecting this peripheral information every time a certain period elapses; when the arrangement of furniture is changed, the map information may be updated to reflect the state after the change. Further, the map management unit 303 may acquire information (map information, location information, etc.) from the outside via the communication unit 223 and perform mapping. The map management unit 303 may also be configured to detect objects existing in each area and associate a usage with the area.
  • For example, an area where a bed is detected may be associated with a bedroom, and an area with toys and desks may be associated with a children's room. If the purpose of an area cannot be specified, the robot may be configured to accept a setting from the user.
  • FIG. 5 is a flowchart showing a series of operations of the communication control process of the robot 100 according to the present embodiment.
  • This process is realized by the CPU 211 of the robot 100 reading and executing the program stored in the HDD 212.
  • Each processing step is realized, for example, in cooperation with the units of FIG. 2 and the processing units of FIG. 3, but here, to simplify the explanation, the processing subject is described collectively as the robot 100.
  • In S501, the robot 100 monitors the state of the target user.
  • The state monitoring is performed while the target user acts within the range in which the target user can enter the field of view of the imaging unit 215 and within the range in which the sounds of the target user's actions can be picked up.
  • The robot 100 may collect information while moving so that the state of the target user can be monitored more easily.
  • In S502, the robot 100 determines whether the target user is in a predetermined user state.
  • The predetermined user states identified by the robot 100 are the user states described above with reference to FIG. 4, and the robot 100 identifies each user state by the respective method described with reference to FIG. 4. If a predetermined user state is identified (YES in S502), the robot 100 proceeds to S503; if not (NO in S502), it returns to S501 and continues the state monitoring.
  • In S503, the robot 100 acquires the user state of the remote user.
  • At this time, the robot 100 may, for example, transmit state information indicating the user state identified in S502 to the device 150 used by the remote user, or transmit request information requesting the user state of the remote user to the device 150.
  • The robot 100 then acquires the state information indicating the user state of the remote user, transmitted by the device 150 of the remote user in response.
  • In S504, the robot 100 determines whether communication between the target user and the remote user is possible, based on the identified user state of the target user and the user state of the remote user. For example, the robot 100 determines that the target user and the remote user can communicate when both the identified user state of the target user and the user state of the remote user are communicable user states. In the example of this embodiment, each user state of the target user shown in FIG. 4 is a communicable state; therefore, if the remote user has been set as able to communicate on the operation screen shown in FIG. 7, it is determined that the target user and the remote user can communicate with each other. If it is determined that communication is possible (YES in S504), the robot 100 proceeds to S505; if not (NO in S504), the robot 100 ends this process.
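The determination in S504 can be sketched as a conjunction of the two sides' communicability. The state names mirror those of FIG. 4; the function itself is an illustrative assumption, not the patent's implementation.

```python
# User states of the target user that are treated as communicable in FIG. 4.
COMMUNICABLE_TARGET_STATES = {
    "strong emotions are expressed",
    "learning/practice start",
    "immediately before going to bed",
    "immediately before going out",
    "immediately before returning home",
    "watching TV",
    "expressing intention to communicate",
}

def can_communicate(target_state, remote_is_communicable):
    """S504-style check: both users must be in a communicable state."""
    return (target_state in COMMUNICABLE_TARGET_STATES
            and remote_is_communicable)
```

A state such as "sleeping" or "bathing" on either side would make the check fail and end the process.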
  • In S505, the robot 100 moves to the place where communication is performed. For example, the robot 100 moves to the target user's location when the target user is not nearby. Further, when the communication location for the identified user state is a specific place (that is, not the current position of the target user), the robot moves to that specific place. For example, when the identified user state of the target user is "immediately before going out" or "immediately before returning home", the robot 100 moves to the "entrance", which is the communication location. That is, the robot 100 can move to a different communication location according to the identified user state of the target user.
  • Next, the robot 100 proposes communication to the target user. Specifically, in response to the target user approaching within a predetermined distance, the robot 100 displays information proposing communication on the image display unit 218 or outputs it as voice from the audio output unit 217. As described above, a specific communication proposal includes different communication content depending on the user state.
  • The robot 100 then waits to receive a response to the proposal from the target user. For example, the robot 100 receives either "communicate" or "do not communicate" as the response, through a touch on the touch panel or by voice.
  • In S507, the robot 100 determines whether the target user has accepted the communication proposal.
  • If the response to the proposal is "communicate", the robot 100 determines that the target user has accepted the proposal (YES in S507) and proceeds to S508.
  • If the response to the proposal is "do not communicate" (or a timeout occurs) (NO in S507), this process ends.
  • In S508, the robot 100 transmits a call start request to the remote user's device 150, and waits for a response to the call start request from the device 150.
  • When the remote user's device 150 receives the call start request, it notifies the remote user of the arrival of the call start request (that is, it intervenes with the remote user) using voice information or the like, and waits for the remote user to accept the call start.
  • The call start request transmitted by the robot 100 may include the user state of the target user and the communication content. The remote user's device 150 can then present the target user's state and the communication content, propose the communication, and accept a selection of "communicate" or "do not communicate" from the remote user. In this case, the remote user can start communication after confirming the user state of the target user.
  • In S509, when the robot 100 receives the response to the call start request from the device 150, it determines whether the call start request has been accepted on the device 150 side. The robot 100 proceeds to S510 when the response accepts the call start request (YES in S509), and ends this process when it does not (NO in S509).
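The S508/S509 exchange can be sketched as a small request/response pair. The message format below is entirely hypothetical; the patent only states that the request may carry the target user's state and the communication content.

```python
def make_call_start_request(target_state, content):
    """Build the call start request sent by the robot in S508."""
    return {"type": "call_start_request",
            "user_state": target_state,
            "communication_content": content}

def handle_call_start_request(request, remote_accepts):
    """Device 150 side: after presenting the state and content to the
    remote user, return their accept/decline decision (checked in S509)."""
    assert request["type"] == "call_start_request"
    return {"type": "call_start_response", "accepted": bool(remote_accepts)}
```

Carrying the state and content in the request is what lets the remote user confirm the target user's situation before accepting the call.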
  • In S510, the robot 100 starts a voice call or a video call between the target user and the remote user, and the communication content according to the user state is executed.
  • The robot 100 makes the voice call or video call using inputs from the image pickup unit 215 and the voice input unit 214, and from the device 150 used by the remote user.
  • The target user can set in advance, from the operation unit 219, a change to the background image showing the private space (erasing the background or replacing it with another image).
  • In that case, the robot 100 changes the background of the moving image transmitted during the execution of the communication content, and then transmits the changed moving image to the remote user side. For example, as shown in FIG. 15, when a user at the entrance is photographed as-is, the appearance 1503 of the entrance appears unchanged in the background of the user 1502, as in the image 1501. There are situations where shoes and other belongings are left at the entrance and one does not want others to see them. Therefore, the robot 100 changes the image 1501 and transmits the image 1504 instead.
  • This image 1504 is an image in which the background is deleted or replaced with another image.
  • the change of the transmitted moving image is not limited to the change of the background image; the user area may be replaced with an avatar.
  • the robot 100 may recognize the movement of the target user from the captured image of the target user and reflect it in the movement of the avatar.
  • the change is not limited to the transmitted image; the voice spoken by the target user may be converted into another voice, or the ambient sound may be reduced before transmission. In this way, elements that would make the user uncomfortable if the living space of the target user were transmitted as it is can be appropriately adjusted. That is, the transmitted information can be adjusted to make communication between the target user and the remote user smoother and more comfortable.
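The background erasure or replacement described above could be sketched as follows, assuming a person-segmentation mask for the target user is already available; the function name, the mask source, and the NumPy-based compositing are illustrative assumptions, not part of the original disclosure:

```python
import numpy as np

def redact_background(frame, person_mask, replacement=None):
    """Return `frame` with every non-person pixel erased or replaced.

    frame:        H x W x 3 uint8 image from the robot's camera (assumed format)
    person_mask:  H x W bool array, True where the target user appears
    replacement:  optional H x W x 3 image used as the new background;
                  when None the background is simply blanked out
    """
    if replacement is None:
        replacement = np.zeros_like(frame)  # erase the background
    # keep user pixels, substitute everything else
    out = np.where(person_mask[..., None], frame, replacement)
    return out.astype(np.uint8)
```

In practice the mask would come from a segmentation model; the compositing step itself is independent of how the mask is produced.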
  • the target user may be able to set different background and avatar settings for each location or for each communication content. In this way, the target user can optimize the degree of exposure of the target user's information for each space or communication the user is concerned about.
  • the audio or moving image transmitted from the remote user side may be changed according to the setting by the remote user.
  • the background of a moving image taken by a remote user may be changed, or the sound may be changed.
  • the robot 100 may modify the received audio or moving image before presenting it, or may receive and present audio or video already changed on the device side used by the remote user.
  • the robot 100 determines whether the call is terminated. For example, when the end operation is received from the operation unit 219 of the robot 100, it is determined that the call is ended and the present process is ended. If not, the call in S510 is continued.
  • FIG. 6 is a flowchart showing history processing in the robot 100 of the present embodiment.
  • this process is realized by the CPU 211 of the robot 100 reading and executing the program stored in the HDD 212.
  • Each processing step is realized, for example, in cooperation with the components of FIG. 2 and the processing units of FIG. 3, but here, to simplify the explanation, the processing subject is described comprehensively as the robot 100.
  • the robot 100 acquires the behavior history information of the target user to be analyzed for a period of, for example, 2 weeks, 1 month, 6 months, etc. from the behavior history DB 312.
  • the robot 100 extracts the behavior pattern of the target user, for example, by time zone.
  • the robot 100 extracts, for example, a pattern in which user states such as "immediately before going to bed”, “immediately before going out”, and “immediately before returning home” are identified near a specific time and at a specific place, respectively.
  • the robot 100 stores the specific time and specific place of the extracted action pattern in the action history DB 312 as information for predicting the occurrence timing and communication place of a predetermined user state.
  • the robot 100 may refer to the stored information and change the location for monitoring the user status of S501 described above so that the user status can be determined more reliably. That is, it becomes possible to anticipate the occurrence of a user state, anticipate an appropriate place, and more accurately intervene in the user (suggest communication).
  • the robot 100 may simplify the start of communication when a specific time and a specific place have been extracted for a user state such as "immediately before going out" or "immediately before returning home". That is, if the robot 100 moves to the communication location at the expected time and the user states of the target user and the remote user indicate that they can communicate with each other, communication may be started while omitting the confirmation of acceptance of the proposal in S507 and acceptance of the call in S509. In this way, communication such as exchanging a word of greeting can be started easily.
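The pattern extraction in this history processing (finding user states that recur at a specific time and place) could be sketched as below. The record format `(user_state, place, hour)` and the occurrence threshold are assumptions for illustration, not the patented implementation:

```python
from collections import Counter

def extract_patterns(history, min_count=3):
    """For each user state, find the (place, hour) at which it recurs.

    history: iterable of (user_state, place, hour) records drawn from the
    action history DB over e.g. the last few weeks.  A state qualifies as a
    pattern when the same (state, place, hour) triple appears at least
    `min_count` times; the most frequent such triple wins per state.
    """
    counts = Counter((state, place, hour) for state, place, hour in history)
    best = {}
    for (state, place, hour), n in counts.items():
        if n >= min_count and n > best.get(state, (0,))[0]:
            best[state] = (n, place, hour)
    # map: user state -> predicted (communication place, occurrence hour)
    return {state: (place, hour) for state, (n, place, hour) in best.items()}
```

The resulting mapping is what the robot would consult to anticipate where and when to monitor for a predetermined user state.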
  • FIG. 9 shows an example of the communication system 90 according to the present embodiment.
  • the information processing server 900 is added.
  • the robot 100 transmits / receives data to / from the information processing server 900 via the network 120.
  • the device 150 used by the remote user 160 also transmits / receives data to / from the information processing server 900.
  • FIG. 10 is a block diagram showing a functional configuration example of the information processing server 900.
  • the control unit 1010 includes one or more CPUs (Central Processing Units) 1011, an HDD (Hard Disk Drive) 1012, and a RAM (Random Access Memory) 1013.
  • the CPU 1011 controls various processes shown below by reading and executing the program stored in the HDD 1012.
  • the HDD 1012 is a non-volatile storage area, and stores programs corresponding to various processes.
  • a semiconductor memory may be used instead of the HDD.
  • the RAM 1013 is a volatile storage area and is used as, for example, a work memory.
  • the control unit 1010 may be composed of a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), a dedicated circuit, or the like. Further, each component of the control unit 1010 may have a virtualized configuration.
  • the power supply unit 1014 is a portion that supplies power to the information processing server 900 from the outside.
  • the communication unit 1015 is a part for communicating with the robot 100 and the device 150 used by the remote user via the network 120, and the communication method, communication protocol, and the like are not particularly limited.
  • FIG. 11 is a diagram showing an example of a software configuration of the robot 100 according to the present embodiment.
  • each processing unit is realized by the CPU 211 reading and executing the program stored in the HDD 212 or the like.
  • each DB (database) is constructed in the HDD 212 or the like.
  • the software configuration shows only the configuration examples necessary for the implementation of this embodiment, and each software configuration such as firmware, OS, middleware, and framework is omitted.
  • the user state identification unit 301 is a part that identifies the user state of the target user based on the information from the sensor unit 222, the image pickup unit 215, and the voice input unit 214, and the information received from the outside.
  • the user state is the same as that described above with reference to FIG. It should be noted that not all user states need to be identified by the user state identification unit 301; a part of the user states may be set from the outside via the communication unit 223, or may be set on the robot 100 side via the operation unit 219.
  • the user status DB 313 stores the user status table.
  • the user state DB 313, which is the same as that of the first embodiment, is synchronized with the user state DB of the information processing server 900 when updated by the robot 100, so that substantially the same data is held.
  • the user state identified by the user state identification unit 301 is transmitted to the information processing server 900 as state information indicating the user state.
  • the user state identification unit 301 stores the user state of the target user, which is periodically identified, and the identified place and time as the action history of the target user in the action history DB 312.
  • when the user state of the target user is periodically identified, if no user state shown in FIG. 4 can be identified, the user state is stored as "unidentified" or the like.
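The periodic logging with the "unidentified" fallback described above could look like the following sketch; the callable-based interface and the dictionary record layout are illustrative assumptions:

```python
import datetime

def record_user_state(identify_state, behavior_history, place, now=None):
    """Periodically log the target user's state into the action history.

    identify_state:  callable returning a user-state string, or None when no
                     state in the predefined table can be identified
                     (then "unidentified" is stored, as described above)
    behavior_history: list standing in for the action history DB
    """
    state = identify_state() or "unidentified"
    behavior_history.append({
        "state": state,
        "place": place,
        "time": now or datetime.datetime.now().isoformat(),
    })
    return state
```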
  • the communication processing unit 304 exchanges information with the outside. For example, the communication processing unit 304 transmits / receives data exchanged with the information processing server 900, and transmits / receives data of a voice call or a video call for communication between the target user and the remote user.
  • the simple communication control unit 1101 receives information proposing communication from the information processing server 900 and intervenes in the target user. It then performs control to start communication (a voice call or a video call) between the target user and the remote user. The process by the simple communication control unit 1101 will be described later as the simple communication control process.
  • the map management unit 303, the action history processing unit 307, the movement control unit 308, and each DB are substantially the same as those in the first embodiment.
  • Each DB is synchronized with the user state DB 1210, the map information DB 1211, and the action history DB 1212 on the information processing server 900 side, and substantially the same data is held.
  • FIG. 12 is a diagram showing an example of a software configuration of the information processing server 900 according to the present embodiment.
  • each processing unit is realized by the CPU 1011 reading and executing the program stored in the HDD 1012 or the like.
  • each DB (database) is constructed in the HDD 1012 or the like.
  • the software configuration shows only the configuration examples necessary for the implementation of this embodiment, and each software configuration such as firmware, OS, middleware, and framework is omitted.
  • the user state acquisition unit 1201 is a part that acquires state information indicating the user state from the robot 100. Even when a part of the user state is set from the outside in the robot 100, the state information indicating the set user state is transmitted to the information processing server 900.
  • the user state DB 1210 stores the user state table.
  • the user state DB 1210 is synchronized with the user state DB of the robot 100.
  • the user state processing unit 1202 specifies the communication content to be performed for the user state by using the user state of the target user acquired by the user state acquisition unit 1201 and the user state table stored in the user state DB 1210.
  • the communication processing unit 1204 exchanges information with the robot 100 and the device 150 used by the remote user. For example, the communication processing unit 1204 receives state information indicating the state of the remote user, and transmits / receives data of a voice call or a video call for the target user and the remote user to communicate with each other. Alternatively, state information indicating the user state identified for the target user may be transmitted to the external device.
  • the remote user status acquisition unit 1205 acquires status information indicating the user status of the remote user via the network 120.
  • the communication control unit 1206 controls the robot 100 so as to intervene in the target user based on the user state of the target user acquired by the user state acquisition unit 1201 and the user state of the remote user acquired by the remote user state acquisition unit 1205. The robot 100 is then controlled so as to start communication (a voice call or a video call) between the target user and the remote user.
  • the processing by the communication control unit 1206 will be described later as a communication control processing.
  • the information processing server 900 determines whether the target user and the remote user can communicate with each other, and then causes the robot 100 to intervene in the target user and propose communication with the remote user. The user state of the target user identified by the robot 100 is the same as in the first embodiment.
  • the information processing server 900 acquires the user state of the remote user and takes it into consideration.
  • the user state of the remote user is simply either a state in which communication is possible, or a state in which communication is possible in a predetermined time zone.
  • Such a user state is set by an operation screen similar to the operation screen shown in FIG. 7 on the device 150 used by the remote user.
  • a robot similar to the robot 100 may exist on the remote user side, and the user state of the remote user may be identified by the robot.
  • Method of proposing communication: when the information processing server 900 according to the present embodiment determines that communication is possible in consideration of the user state of the target user and the user state of the remote user, it transmits, to the robot 100, information proposing communication with the remote user to the target user. The robot 100 provides the information proposing communication with the remote user to the target user based on the information received from the information processing server 900.
  • the information proposing communication is provided as in Embodiment 1.
  • FIG. 13 is a flowchart showing a series of operations of the communication control process in the information processing server 900 according to the present embodiment.
  • this process is realized by the CPU 1011 of the information processing server 900 reading and executing the program stored in the HDD 1012.
  • Each processing step is realized, for example, in cooperation with the components of FIG. 10 and the processing units of FIG. 12, but here, to simplify the explanation, the processing subject is described comprehensively as the information processing server 900.
  • the information processing server 900 acquires state information indicating the user state of the target user identified by the robot 100.
  • the information processing server 900 determines whether the acquired user state is a predetermined user state.
  • the predetermined user state is the user state described above with respect to FIG.
  • the information processing server 900 proceeds to S1303 when it determines that the acquired user state is a predetermined user state (YES in S1302), and returns to S1301 when it does not (NO in S1302).
  • the information processing server 900 acquires the user status of the remote user.
  • the information processing server 900 may, for example, transmit the state information indicating the user state identified in S1302 to the device 150 used by the remote user, or may transmit request information requesting the user state of the remote user to the device 150.
  • the information processing server 900 acquires the state information indicating the user state of the remote user, which is transmitted by the device 150 of the remote user in response to the above information.
  • the information processing server 900 determines whether communication between the target user and the remote user is possible based on the acquired user state of the target user and the user state of the remote user. For example, the information processing server 900 determines that the target user and the remote user can communicate with each other when the user state of the target user and the user state of the remote user are both communicable user states. In the example of this embodiment, the user state of the target user shown in FIG. 4 is a communicable state. Therefore, if the remote user is set to be able to communicate on the operation screen shown in FIG. 7, it is determined that the target user and the remote user can communicate with each other. The information processing server 900 proceeds to S1305 when it is determined that communication is possible (YES in S1304), and ends this process when it is not (NO in S1304).
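A minimal sketch of this S1304-style decision is shown below, assuming the remote user's setting (from a screen like FIG. 7) is either unavailable, always available, or an hour window; all names and the setting encoding are illustrative assumptions:

```python
def remote_is_available(setting, hour):
    """Remote user's availability.

    setting: None (unavailable), "always", or a (start, end) hour window
             such as (18, 21) meaning available from 18:00 to 21:00.
    """
    if setting is None:
        return False
    if setting == "always":
        return True
    start, end = setting
    return start <= hour < end

def can_communicate(target_state, predefined_states, remote_setting, hour):
    # both sides must be in a communicable state: the target user must be in
    # one of the predetermined user states, and the remote user must have
    # flagged availability (possibly restricted to a time zone)
    return target_state in predefined_states and remote_is_available(
        remote_setting, hour)
```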
  • the information processing server 900 transmits the proposal information for the robot 100 to propose communication to the target user to the robot 100.
  • the proposal information includes communication contents that differ depending on the user status.
  • the information processing server 900 determines whether the target user has accepted the communication proposal. For example, the information processing server 900 receives response information indicating a response by the target user from the robot 100. Then, when the response information is "communicate", it is determined that the target user has accepted the proposal (YES in S1306), and the process proceeds to S1307. On the other hand, when the response to the proposal is "not communicate" (or a timeout occurs) (NO in S1306), this process ends.
  • the information processing server 900 transmits a call start request to the remote user's device 150, and waits for a response to the call start request from the device 150.
  • the remote user's device 150 receives the call start request, it notifies the arrival of the call start request (that is, intervenes in the remote user) using voice information or the like, and waits for the remote user to accept the call start.
  • the call start request transmitted by the information processing server 900 may include the user status of the target user and the communication content. Therefore, the device 150 of the remote user presents the user status and communication contents of the target user, proposes communication, and then accepts the selection of "communicate" or "not communicate” from the remote user. In this case, the remote user can start communication after confirming the user status of the target user.
  • the information processing server 900 determines whether the call start request has been accepted on the device 150 side. If the response accepts the call start request (YES in S1308), the information processing server 900 proceeds to S1309; if it does not (NO in S1308), this process ends.
  • the information processing server 900 starts a voice call or a video call between the target user and the remote user, and the communication content according to the user state is executed.
  • the information processing server 900 relays the voice call or the video call using input from the image pickup unit 215 and the voice input unit 214 of the robot 100, and from the device 150 used by the remote user.
  • the information processing server 900 can change the background image (erasing the background or replacing it with another image) in which the private space of the target user is reflected.
  • the information processing server 900 acquires the setting information preset by the operation unit 219 of the robot 100 from the robot 100 and changes the background image.
  • the change of the background image may be the same as the change described with reference to FIG.
  • the information processing server 900 changes the background of the moving image transmitted from the robot 100 and then transmits the changed moving image to the remote user side.
  • the change of the transmitted moving image is not limited to the change of the background image; the user area may be replaced with an avatar.
  • the information processing server 900 may recognize the movement of the target user from the captured image of the target user and reflect it in the movement of the avatar.
  • the change is not limited to the transmitted image; the voice spoken by the target user may be converted into another voice, or the ambient sound may be reduced before transmission. In this way, elements that would make the user uncomfortable if the living space of the target user were transmitted as it is can be appropriately adjusted. That is, the transmitted information can be adjusted to make communication between the target user and the remote user smoother and more comfortable.
  • the target user may be able to set different background and avatar settings for each location or for each communication content. In this way, the target user can optimize the degree of exposure of the target user's information for each space or communication the user is concerned about.
  • the audio or moving image transmitted from the remote user side may be changed according to the setting by the remote user.
  • the background of a moving image taken by a remote user may be changed, or the sound may be changed.
  • the information processing server 900 may receive the changed voice or moving image on the device side used by the remote user and transmit it to the robot 100.
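The ambient-sound reduction mentioned for the transmitted voice could be approximated with a very crude noise gate as sketched below; the thresholding approach and parameter values are illustrative assumptions (a real system would use proper noise suppression):

```python
import numpy as np

def reduce_ambient(samples, gate=0.1, ambient_gain=0.2):
    """Attenuate quiet (ambient) portions of a mono signal in [-1, 1].

    Samples whose magnitude falls below `gate` are treated as background
    noise and scaled down by `ambient_gain`, while louder (presumably
    spoken) samples pass through unchanged.
    """
    out = np.asarray(samples, dtype=float).copy()
    quiet = np.abs(out) < gate
    out[quiet] *= ambient_gain
    return out
```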
  • the information processing server 900 determines whether the call is terminated. For example, when a termination request is received from the robot 100, it is determined that the call is terminated and the present process is terminated. If not, the call in S1309 is continued.
  • FIG. 14 is a flowchart showing a series of operations of the simple communication control process in the robot 100 according to the present embodiment.
  • this process is realized by the CPU 211 of the robot 100 reading and executing the program stored in the HDD 212.
  • Each processing step is realized, for example, in cooperation with the components of FIG. 2 and the processing units of FIG. 3, but here, to simplify the explanation, the processing subject is described comprehensively as the robot 100.
  • the robot 100 monitors the status of the target user.
  • the robot 100 transmits state information indicating the user state to the information processing server 900.
  • the robot 100 receives the proposal information from the information processing server 900. Receiving the proposal information means that the information processing server 900 has determined that the target user and the remote user can communicate with each other.
  • the robot 100 moves to the place where communication is performed. For example, when the target user is not nearby, the robot 100 moves to the place where the target user is. Further, when the communication location for the identified user state is a specific location (that is, not the current position of the target user), the robot moves to that specific location. For example, when the user state included in the proposal information is "immediately before going out" or "immediately before returning home", the robot 100 moves to the "entrance", which is the communication place. That is, the robot 100 can move to a different communication location according to the identified user state of the target user.
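The choice of destination described above could be sketched as a simple mapping from user state to a fixed communication place, falling back to the target user's current position; the table contents and names are illustrative assumptions based on the examples in the text:

```python
# fixed communication places for particular user states (illustrative)
COMMUNICATION_PLACE = {
    "immediately before going out": "entrance",
    "immediately before returning home": "entrance",
}

def destination(user_state, target_user_position):
    """Where the robot should go: the state's fixed communication place
    when one is defined, otherwise wherever the target user currently is."""
    return COMMUNICATION_PLACE.get(user_state, target_user_position)
```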
  • the robot 100 proposes communication to the target user based on the proposal information from the information processing server 900. Specifically, the robot 100 displays information proposing communication on the image display unit 218 or outputs it as voice from the voice output unit 217 in response to the target user approaching a predetermined distance or less. As described above, specific communication proposals include different communication contents depending on the user status.
  • the robot 100 waits to receive a response to a proposal from the target user. The robot 100 receives either "communicate” or "does not communicate” as a response to the proposal by touching the touch panel or by voice.
  • the robot 100 transmits the response of the target user to the communication proposal to the information processing server 900.
  • the robot 100 receives a call start request from the information processing server 900.
  • the robot 100 starts a voice call or a video call between the target user and the remote user, and the communication content according to the user state is executed.
  • the robot 100 determines whether the call is terminated. For example, when the end operation is received from the operation unit 219 of the robot 100, it is determined that the call is ended, and the call end request is transmitted to the information processing server 900 to end this process. If not, the call in S1408 is continued.
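The robot-side flow of FIG. 14 described above could be sketched as one pass of an orchestration function, with each robot/server interaction injected as a callable. All parameter names and the string protocol are illustrative assumptions, not the patented interfaces:

```python
def simple_communication_flow(user_state, send_state, receive_proposal,
                              move_to, ask_user, send_response, start_call):
    """One pass of the robot-side simple communication control.

    send_state/receive_proposal/send_response stand in for exchanges with
    the information processing server; move_to/ask_user/start_call stand
    in for the robot's own actions.
    """
    send_state(user_state)                  # report the identified state
    proposal = receive_proposal()           # proposal info, or None
    if proposal is None:
        return "no-proposal"
    move_to(proposal.get("place"))          # go to the communication place
    if ask_user(proposal) != "communicate": # propose and await the response
        send_response("not communicate")
        return "declined"
    send_response("communicate")
    start_call(proposal)                    # voice/video call per user state
    return "call-started"
```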
  • the communication robot (for example, 100) of the above embodiment is An identification means (for example, 301) for identifying the state of the first user, and Receiving means (for example, 304, 305) that receives state information indicating the state of the second user who communicates with the first user remotely from an external device, and A determination means (for example, 306) for determining whether communication between the first user and the second user is possible based on the state of the first user and the state of the second user.
  • and a call control means (for example, 306) that, when it is determined that communication between the first user and the second user is possible, provides a call via the communication robot and the device used by the second user.
  • the call control means makes the call using an image in which at least a part of an image captured of the first user or the second user is changed, or a voice in which at least a part of the voice picked up for the first user or the second user is changed.
  • In the communication robot of the above embodiment, the call control means (for example, 306) makes a call using an image in which the background of the image captured of the first user is changed.
  • the call control means makes a call using a voice obtained by changing the ambient sound of the voice picked up for the first user (for example, 306).
  • the communication between the first user and the second user is communication in which the second user welcomes the first user home or sees the first user off (FIG. 4).
  • when a grandparent who is a remote user talks to the target user via the robot, the child can be welcomed home or seen off in place of the parent even in the parent's absence, and the child's loneliness can be reduced.
  • the communication between the first user and the second user is a communication in which the second user speaks to the crying first user (FIG. 4).
  • the grandparents, who are remote users, can talk to the target user via the robot, which calms the target user's feelings and provides an opportunity to talk honestly.
  • the communication between the first user and the second user is a communication in which the second user confirms a movement or an utterance performed by the first user (FIG. 4).
  • the grandparents, who are remote users, can support the child's practice on behalf of the parent, or together with the parent, by listening to the child read aloud and watching the practice through the robot.
  • the communication between the first user and the second user is a communication in which the first user listens to an utterance by the second user (FIG. 4).
  • grandparents who are remote users can read aloud via the robot, increasing the child's opportunities to hear reading aloud and reducing the burden on parents.
  • The communication robot of the above embodiment further has a movement control means (for example, 308) for moving the communication robot to a place where the first user communicates via the communication robot.
  • the movement control means moves the communication robot to the place based on the current position or action pattern of the first user.
  • the communication robot can autonomously move to a place where a call is provided to the target user.
  • The communication robot of the above embodiment further has a storage means for storing periodically collected information about the state of the first user, and a pattern identification means (for example, 307) for identifying the behavior pattern of the first user based on the periodically collected information about the state of the first user.
  • the movement control means moves the communication robot to the place based on the identified pattern of the state of the first user.
  • the communication robot can autonomously move to a place where a call is provided to a target user based on a pattern of a user state that occurs periodically.
  • the first user is a child.
  • the second user is the grandparent of the first user.
  • In the communication robot of the above embodiment, the device used by the second user is a robot capable of identifying the state of the second user.
  • the user state of the remote user can also be identified by the robot.
  • the control method of the communication robot (for example, 100) in the above embodiment is An identification step (for example, S502) for identifying the state of the first user, A receiving step (for example, S503) of receiving state information indicating the state of the second user who communicates with the first user remotely from an external device, and A determination step (for example, S504) for determining whether communication between the first user and the second user is possible based on the state of the first user and the state of the second user.
  • and a call control step (for example, S506) that, when it is determined that communication between the first user and the second user is possible, provides a call via the device used by the second user and the communication robot.
  • the information processing server (for example, 900) in the above embodiment is A first acquisition means (for example, 1201) for acquiring information about the state of the first user from a communication robot capable of imaging, and A second acquisition means (for example, 1205) for acquiring information about the state of the second user from a device used by the second user who communicates with the first user remotely.
  • a determination means (for example, 1206) for determining whether communication between the first user and the second user is possible based on the state of the first user and the state of the second user.
  • and a call control means (for example, 1204) that, when it is determined that communication between the first user and the second user is possible, provides a call via the device used by the second user and the communication robot.
  • the call control means makes the call using an image in which at least a part of an image captured of the first user or the second user is changed, or a voice in which at least a part of the voice picked up for the first user or the second user is changed.
  • According to the above embodiment, it is possible to provide an information processing server capable of appropriately adjusting the information transmitted when the user and the remote user communicate with each other.
  • the information processing method executed by the information processing server in the above embodiment is The first acquisition step (for example, S1301) of acquiring information about the state of the first user from the communication robot capable of imaging, and A second acquisition step (for example, S1303) of acquiring information about the state of the second user from a device used by the second user who remotely communicates with the first user.
  • a determination step (for example, S1304) for determining whether communication between the first user and the second user is possible based on the state of the first user and the state of the second user.
  • and a call control step (for example, S1309) that provides a call via the device used by the second user and the communication robot when it is determined that communication between the first user and the second user is possible. In the call control step, the call is made using an image in which at least a part of the image captured of the first user or the second user is changed, or a voice in which at least a part of the voice picked up for the first user or the second user is changed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to a communication robot comprising: an identification means that identifies the state of a first user; a receiving means that receives, from an external device, state information indicating the state of a second user who communicates remotely with the first user; a determination means that determines, based on the state of the first user and the state of the second user, whether communication between the first user and the second user is possible; and a call control means that, when communication between the first user and the second user is determined to be possible, provides a call via the communication robot and a device used by the second user. The call control means makes the call using an image obtained by changing at least a part of a captured image of the first user or the second user, or a voice obtained by changing at least a part of the voice collected for the first user or the second user.
PCT/JP2019/014262 2019-03-29 2019-03-29 Communication robot, control method therefor, information processing server, and information processing method WO2020202353A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/014262 WO2020202353A1 (fr) Communication robot, control method therefor, information processing server, and information processing method
JP2021511727A JP7212766B2 (ja) Communication robot and control method therefor, information processing server, and information processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/014262 WO2020202353A1 (fr) Communication robot, control method therefor, information processing server, and information processing method

Publications (1)

Publication Number Publication Date
WO2020202353A1 (fr) 2020-10-08

Family

ID=72666637

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/014262 WO2020202353A1 (fr) Communication robot, control method therefor, information processing server, and information processing method

Country Status (2)

Country Link
JP (1) JP7212766B2 (fr)
WO (1) WO2020202353A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0476698A * 1990-07-13 1992-03-11 Toshiba Corp Image monitoring device
WO1999067067A1 * 1998-06-23 1999-12-29 Sony Corporation Robot and information processing system
JP2013141246A * 2011-12-29 2013-07-18 Samsung Electronics Co Ltd Video apparatus and control method thereof
JP2016184980A * 2016-07-26 2016-10-20 Casio Computer Co., Ltd. Communication device and program
JP2018092528A * 2016-12-07 2018-06-14 The University of Electro-Communications Chat system, management device, terminal device, destination selection support method, and destination selection support program
WO2018163356A1 * 2017-03-09 2018-09-13 Shiseido Co., Ltd. Information processing device and program


Also Published As

Publication number Publication date
JP7212766B2 (ja) 2023-01-25
JPWO2020202353A1 (ja) 2021-12-09

Similar Documents

Publication Publication Date Title
AU2014236686B2 Apparatus and methods for providing a persistent companion device
US9386147B2 Muting and un-muting user devices
WO2016052018A1 Home appliance management system, home appliance, remote control device, and robot
JP2017009867A Control device, method therefor, and program
JP2017010176A Device identification method, device identification apparatus, and program
JPWO2015133022A1 Information processing device, information processing method, and program
KR102463806B1 Movable electronic device and operation method thereof
WO2017141530A1 Information processing device, information processing method, and program
JP7102169B2 Apparatus, robot, method, and program
JP7031578B2 Information processing device, information processing method, and program
CN110178159A Audio/video wearable computer system with integrated projector
WO2020202353A1 Communication robot, control method therefor, information processing server, and information processing method
CN114930795A Method and system for reducing audio feedback
WO2020202354A1 Communication robot, control method therefor, information processing server, and information processing method
JP2019208167A Telepresence system
JP2023130822A Device system, imaging apparatus, and display method
JP2023131635A Display system, display method, imaging apparatus, and program
US11936718B2 Information processing device and information processing method
JPWO2021229692A5 (fr)
JPWO2015174112A1 Terminal device, system, information presentation method, and program
WO2024070550A1 System, electronic device, system control method, and program
US11216233B2 Methods and systems for replicating content and graphical user interfaces on external electronic devices
JP7279706B2 Information processing device, information processing system, information processing method, and information processing program
US20230364800A1 Virtual and physical social robot with humanoid features
CN112060084A Intelligent interactive system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19923105; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021511727; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19923105; Country of ref document: EP; Kind code of ref document: A1)