WO2020075647A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2020075647A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
avatar
message
service
Prior art date
Application number
PCT/JP2019/039355
Other languages
English (en)
Japanese (ja)
Inventor
修 豊崎
Original Assignee
株式会社豊崎会計事務所
ベストライト・インヴェストメント・リミテッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社豊崎会計事務所, ベストライト・インヴェストメント・リミテッド
Priority to JP2020511828A priority Critical patent/JP7002085B2/ja
Publication of WO2020075647A1 publication Critical patent/WO2020075647A1/fr
Priority to JP2021203938A priority patent/JP7468850B2/ja

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services

Definitions

  • The present invention relates to an information processing device.
  • There is already a system that supports the process of establishing a marriage by enriching the knowledge needed for conversation between men and women on a date and supporting that conversation (see Patent Document 1). There is also an exchange memo pad intended to allow both passive and active persons to converse equally with an intended partner (see Patent Document 2). In addition, there is a server device that aims to help the user's marriage-hunting activity proceed more smoothly (see Patent Document 3).
  • However, unsuccessful matching is often caused by at least one of the following four problems: (1) a communication ability problem, (2) an appearance problem, (3) a residence problem, and (4) a time problem. Specifically, (1) is the problem that matching fails because of a lack of communication ability, because the user's intent is not conveyed to the other party, or because of an introverted personality. (2) is the problem that matching fails because the other party's appearance deviates from one's ideal. (3) is the problem that matching fails because the other party's residence is too far away. (4) is the problem that matching fails because the other party's daily schedule does not match one's own.
  • The present invention has been made in view of such a situation, and its purpose is to provide a means that contributes to improving the user's real life by solving the problems of communication ability, appearance, residence, and time.
  • To achieve the above purpose, an information processing device of one embodiment of the present invention includes: acquisition means for acquiring information about the user as user information; and message management means for managing messages sent and received as a dialogue via a user terminal operated by the user. When there is a message from the user, the message management means generates a message to be transmitted to the user based on the content of that message and the user information; when there is no message from the user, it generates a message to be transmitted to the user based on the user information.
  • The information processing device may further include space management means that generates a virtual space accessible via the user terminal and manages the generated virtual space, and avatar management means for managing one or more avatars that can exist in the virtual space.
  • The avatar management means may generate a first avatar whose subject is the user and a second avatar whose subject is entirely artificially created.
  • According to the present invention, by solving the problems of communication ability, appearance, residence, and time, it is possible to contribute to improving the user's real life.
  • FIG. 2 is a block diagram illustrating a hardware configuration of the management server in FIG. 1. A functional block diagram shows an example of the functional configuration of the management server of FIG. 1.
  • FIG. 1 is an image diagram showing an outline of this service provided by the service provider P.
  • In the present service, when it is determined that the cause of unsuccessful matching in the male-female matching service is at least one of the following four problems, various sub-services are provided so that the target user U can eliminate that cause. That is, when the cause of the unsuccessful matching is (1) a communication ability problem, (2) an appearance problem, (3) a residence problem, or (4) a time problem, various sub-services for solving it are provided.
  • (Dialogue service) When the unsuccessful matching is caused by the communication ability problem described in (1), the target user U is provided with the "dialogue service", a sub-service of this service.
  • The dialogue service is a service in which the target user U can interact with the concierge C or the advisor D via the user terminal.
  • Here, "dialogue" means exchanging messages between a plurality of persons, and "via a user terminal" means performing message input processing and message output processing on a smartphone or the like operated by the user U.
  • “Inputting a message” means inputting voice with a microphone or inputting with a touch panel to a smartphone or the like operated by the user U.
  • “Outputting a message” means outputting a message from the concierge C, the advisor D, or the like from the speaker or screen of the smartphone or the like operated by the user U.
  • The "concierge" C interacts with the user U in the position of a so-called caretaker. For example, in response to the user U's message "I'm hungry", the concierge C returns the message "It's about lunch time". When the user U then inputs the message "Shall we eat meat?", the concierge C returns the message "Eat out, or order delivery?". When the user U inputs the message "Let's eat out", the concierge C returns the message "Shall I reserve somewhere?" and automatically searches for and reserves a nearby restaurant.
  • The advisor D, like the concierge C, interacts with the user U, but in addition interacts from the standpoint of an advisor who gives advice and guidance. For example, the advisor D sends messages that also serve as advice, such as "You ate meat yesterday, so perhaps you should eat fish today." When the user U then inputs the message "Well then, book a restaurant that takes calories and the nutrients I am lacking into account", the advisor D returns a message that also serves as advice, such as "OK. I made a reservation at restaurant ⁇ for 12:30. The recommended menu at restaurant ⁇ is ⁇, but the account balance of Mr./Ms. ⁇ (the name of user U) is insufficient, so it would be better to sell the ⁇⁇ shares that you hold."
  • In this way, the user U uses the dialogue service to interact with the concierge C and the advisor D in every scene of daily life.
  • Since the concierge C and the advisor D are virtual beings rather than real humans, even a user U who lacks communication ability can interact with them proactively.
  • As a result, the communication ability of the user U improves without the user being aware of it, so that the user can carry out a dialogue without any problem even when the other party is a real person rather than a virtual being.
  • Furthermore, since the concierge C and the advisor D are given genders, a male user U who wants to improve his communication ability with women can, for example, choose to have a dialogue with a female concierge C or advisor D. As a result, the male user U becomes able to communicate with a real woman without any problem.
  • The content and timing of messages proactively issued by the advisor D reflect the user information separately provided from the information providing server 3, which provides the information providing service.
  • Therefore, the advisor D can include the personal information of the user U, as well as professional and accurate information, in the content of the dialogue with the user U.
  • “User information” includes all information related to the daily life of user U.
  • Specifically, the user information includes information on the financial transactions of the user U (hereinafter referred to as "user financial information"), information on the insurance transactions of the user U (hereinafter referred to as "user insurance information"), information on the medical treatment of the user U (hereinafter referred to as "user medical information"), and the daily biometric information of the user U (hereinafter referred to as "user biometric information").
  • the user information includes the usage record of the matching service for men and women and the content thereof (hereinafter referred to as “matching information”).
  • the matching information includes the comment of the other party when the matching is unsuccessful, the result of the questionnaire, and the like.
  • the user financial information includes, for example, information on the deposit balance of the user U, the record of transfer and transfer, and the like.
  • the user insurance information includes, for example, information on insurance premiums contracted by the user U, contents of compensation, and the like.
  • the user medical information includes, for example, the content of the illness of the user U, information on medical treatment results, doctor's findings, prescription history of medicines, and the like.
  • The user biometric information includes, for example, the user U's body temperature since birth, pulse rate, heart rate, meal content, meal timing, meal time, exercise content and amount, exercise time, sleep time, sleep timing, blood components, genetic information, and the like.
  • the specific method for acquiring the user biometric information is not particularly limited. What the user U recorded by himself / herself may be user biometric information, or information measured by a predetermined measuring device controlled by the information providing server may be user biometric information.
  • the user financial information, the user insurance information, and the user medical information are information managed by the financial institution, the insurance company, and the medical institution, respectively, and part or all of them are shared with the user U.
  • the user biometric information is information managed by the information providing service, and a part or all of the user biometric information is shared with the user U.
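  • Purely as an illustration, the user information categories described above (financial, insurance, medical, biometric, and matching information) could be modeled as a simple record. The following is a minimal sketch; every class and field name is an assumption introduced only for clarity and is not part of the disclosure.

      from dataclasses import dataclass, field
      from typing import List, Optional

      @dataclass
      class UserFinancialInfo:
          deposit_balance: float = 0.0                      # deposit balance of user U
          transfer_records: List[str] = field(default_factory=list)

      @dataclass
      class UserInsuranceInfo:
          premium: float = 0.0                              # contracted insurance premium
          coverage: str = ""                                # contents of compensation

      @dataclass
      class UserMedicalInfo:
          illnesses: List[str] = field(default_factory=list)
          prescriptions: List[str] = field(default_factory=list)

      @dataclass
      class UserBiometricInfo:
          heart_rate: Optional[int] = None
          sleep_hours: Optional[float] = None
          meal_log: List[str] = field(default_factory=list)

      @dataclass
      class MatchingInfo:
          partner_comments: List[str] = field(default_factory=list)      # comments when matching fails
          questionnaire_results: List[str] = field(default_factory=list)

      @dataclass
      class UserInfo:
          user_id: str
          financial: UserFinancialInfo = field(default_factory=UserFinancialInfo)
          insurance: UserInsuranceInfo = field(default_factory=UserInsuranceInfo)
          medical: UserMedicalInfo = field(default_factory=UserMedicalInfo)
          biometric: UserBiometricInfo = field(default_factory=UserBiometricInfo)
          matching: MatchingInfo = field(default_factory=MatchingInfo)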
  • FIG. 2 is an image diagram showing the flow of various information provided from the information providing service to this service.
  • each of the user financial information, the user insurance information, and the user medical information is shared between the financial institution, the insurance company, and the medical institution and the information providing service.
  • User financial information, user insurance information, user medical information, and user biometric information are provided from the information providing service to the dialogue service as user information.
  • As a result, as in the dialogue between the advisor D and the user U described above, it is possible to have a dialogue that takes into account, in combination, the price of the menu at the reserved restaurant, the balance of the user U's deposit account, and the selling price of the stock (asset) held by the user U.
  • the user U1 who uses the interactive service to which the information providing service is applied can also follow the life of the user U2 who uses only the information providing service.
  • the following cases can be assumed, for example.
  • the user U2 is a single-living mother and the user U1 is the child.
  • In this case, the dialogue service can reflect the user biometric information, user medical information, and the like of the mother (user U2) in the dialogue between the child (user U1) and the advisor D.
  • As a result, the advisor D can send the child (user U1) a message that also serves as a status report on the mother (user U2), such as "Mom looks like she has a cold."
  • Even if the child (user U1) merely replies "I haven't been in touch much lately, so I'll contact her", at least the situation of the mother (user U2) has been conveyed to the child. This can also help address the problem of dying alone, which arises when a child does not notice changes in the physical condition of a parent living alone.
  • the user U2 does not have to be a human.
  • Since the user U2 may also be a pet, it can be assumed, for example, that the user U2 is a pet and the user U1 is its owner.
  • In this case, the dialogue service can reflect the user biometric information of the pet (user U2) in the dialogue between the owner (user U1) and the advisor D.
  • As a result, the advisor D can send a message that also serves as a status report on the pet, such as "Isn't XX (pet name) stressed recently due to lack of exercise?"
  • If the owner (user U1) replies "I understand, I'll come home early today", this can contribute to the longevity of the pet, a member of the family.
  • the user U2 does not need to be a living being like a human being or a pet.
  • Since the user U2 may also be a car, it is possible to envisage a case where the user U2 is a car and the user U1 is its owner.
  • In this case, the dialogue service can reflect, in the dialogue between the owner (user U1) and the advisor D, the user information of the car (user U2) in addition to the user information of the owner (user U1).
  • Specifically, for example, if the owner (user U1) tries to drive the car (user U2) after drinking alcohol, the advisor D detects this from the owner's (user U1's) user information (particularly the user biometric information).
  • The advisor D then tells the owner (user U1), "Drunk driving is prohibited by Article 65 of the Road Traffic Act. If you drive while intoxicated, not only will your license be revoked, but you may also be punished by imprisonment of up to 5 years or a fine of up to 1,000,000 yen!", so that the owner (user U1) can recognize the magnitude of the cost of drinking and driving.
  • Furthermore, if the information providing service is linked with a control unit (not shown) that controls driving of the automobile (user U2), it is also possible, when it is detected from the user information that the owner (user U1) has been drinking, to control the automobile (user U2) so that its engine does not start, as in the sketch below. This can forcibly prevent the owner (user U1) from drinking and driving.
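  • As an illustration only, the engine-interlock control mentioned above could look like the following sketch. The function name, data flow, and threshold are assumptions; the disclosure does not specify how the check is implemented. (0.15 mg/L is the breath-alcohol limit under the Japanese Road Traffic Act, used here purely as an example value.)

      def allow_engine_start(breath_alcohol_mg_per_l: float, limit: float = 0.15) -> bool:
          """Return True only when the measured breath alcohol is below the limit."""
          return breath_alcohol_mg_per_l < limit

      # Hypothetical flow: the information providing service passes the owner's latest
      # biometric reading to the vehicle control unit before ignition.
      latest_reading = 0.25
      if not allow_engine_start(latest_reading):
          print("Engine start blocked: drink-driving prevention")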
  • the interaction service can reflect the respective user information for both the interaction between the male (user U1) and the advisor D and the conversation between the female (user U2) and the advisor D.
  • For example, the advisor D can send the man (user U1) a message that also serves as a status report on the woman (user U2), such as "She seems to be depressed because of the recent quarrel."
  • the advisor D can further send a message that also serves as advice, such as “please allow it now?”.
  • the virtual space service is a service that enables the target user U to interact with the 3D avatar A existing in the virtual space VS via the user terminal.
  • “Virtual space” VS refers to a virtual space that includes at least one of augmented reality (AR) and virtual reality (VR).
  • "Avatar" refers to electronic information representing the alter ego of the user U, generated for each user U based on that user's user information.
  • "3D avatar" refers to electronic information that expresses the alter ego of the user U in three dimensions.
  • Hereinafter, the "3D avatar" is also simply referred to as "avatar" A unless the two need to be distinguished.
  • the avatar A includes an augmented reality (AR) avatar Aa and a virtual reality (VR) avatar Av.
  • An augmented reality (AR) avatar Aa is one whose subject is the user U.
  • a virtual reality (VR) avatar Av is one whose subject is completely created by AI (artificial intelligence). In the following, both of them will be simply referred to as “avatars” A unless it is necessary to distinguish them.
  • The user U who uses the virtual space service interacts with augmented reality (AR) avatars Aa and virtual reality (VR) avatars Av while acting in the virtual space as his or her own augmented reality (AR) avatar Aa.
  • Alternatively, the user U may interact with the augmented reality (AR) avatars Aa and virtual reality (VR) avatars Av existing in the virtual space VS without acting as the avatar Aa. Since a large number of avatars A exist in the virtual space VS, the user U has many chances to interact with an avatar A. Accordingly, the user U can improve his or her communication ability by being spoken to by other avatars A or by actively speaking to them.
  • (Production service) In the virtual space service, it is also possible to provide the user U with a service (hereinafter referred to as the "production service") that introduces an avatar A that is a candidate for a relationship partner.
  • the avatar A introduced to the user U by the production service is referred to as “talent” T.
  • Note that the talent T introduced to the user U may be an augmented reality (AR) avatar Aa or a virtual reality (VR) avatar Av. Therefore, the talent T with whom the user U interacts may be the augmented reality (AR) avatar Aa or the virtual reality (VR) avatar Av.
  • When a talent T is introduced to the user U in the production service, the user information of the user U is taken into consideration.
  • the user information also includes matching information.
  • the matching information includes the comment of the other party and the result of the questionnaire when the matching fails. That is, the matching information includes negative information and some positive information about the relationship between the user U and the opposite sex.
  • Negative information and some positive information about user U's relationship with the opposite sex will be important information for the success of user U's next relationship. Therefore, in the production service, the matching information is analyzed, and based on the analysis result, a suitable talent T for the user U to succeed in the relationship is selected and introduced.
  • The specific method of analyzing the user U's usage of the matching service is not particularly limited; for example, it can be analyzed using the following methods (1) to (3). That is, (1) a comparison is made with other cases in which matching was established; (2) the user U's problems are grasped by collecting the impressions, questionnaire results, and the like of the other parties with whom matching was unsuccessful; and (3) the user U's personality, preferred type, and so on are grasped based on the user U's past relationship history. As a result, the negative information and part of the positive information about the user U's relationships with the opposite sex can be analyzed in detail, so that a suitable talent T can be selected for the user U's next relationship to succeed.
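  • The analysis of matching information described above could, for example, be reduced to a simple scoring of candidate avatars against traits learned from past matching results. The sketch below is illustrative only; the trait names, weights, and selection rule are assumptions and not the method of the disclosure.

      from typing import Dict, List

      def score_talent(candidate: Dict[str, float],
                       positive_traits: Dict[str, float],
                       negative_traits: Dict[str, float]) -> float:
          """Higher score = better fit. Trait weights would be derived from matching information."""
          score = 0.0
          for trait, weight in positive_traits.items():
              score += weight * candidate.get(trait, 0.0)   # reward traits linked to past successes
          for trait, weight in negative_traits.items():
              score -= weight * candidate.get(trait, 0.0)   # penalize traits linked to past failures
          return score

      def select_talent(candidates: List[Dict[str, float]],
                        positive_traits: Dict[str, float],
                        negative_traits: Dict[str, float]) -> Dict[str, float]:
          """Pick the candidate avatar with the best score as the talent T."""
          return max(candidates, key=lambda c: score_talent(c, positive_traits, negative_traits))

      # Example with made-up traits:
      candidates = [{"humor": 0.8, "punctuality": 0.2}, {"humor": 0.3, "punctuality": 0.9}]
      print(select_talent(candidates, {"humor": 1.0}, {"punctuality": 0.5}))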
  • FIG. 3 is a diagram showing an example of a pattern of the talent T introduced to the user U1 in the production service.
  • There are three patterns (patterns 1 to 3) of talents T introduced to the user U1, as shown in FIG. 3. Each pattern will be described below.
  • Pattern 1 is a pattern in which an augmented reality (AR) avatar Aa whose subject is another user U2 is introduced to the user U1 as a talent T.
  • Among such augmented reality (AR) avatars Aa, an avatar A suitable for the user U1 to succeed in his or her next relationship is selected as the talent T.
  • The selected talent T is then processed and configured by the production service (hereinafter, this processing and setting is referred to as the "production-side setting").
  • Although the specific contents of the processing and setting in the production-side setting are not particularly limited, in the present embodiment the following elements (1) to (3) are processed and set. That is, based on a prototype of the talent T created by the production service, (1) appearance, (2) personality, and (3) knowledge level are processed and set by the production service.
  • Note that the portrait rights of the user U2 are taken into consideration when the production service creates the prototype of the talent T. Specifically, for (1) appearance, the face and figure of the talent T are processed and set. For (2) personality, the talent T is given, for example, an outgoing personality, an introverted personality, a thoughtful personality, an emotional personality, a personality that values intuition, or a personality that relies on feelings. For (3) knowledge level, the hobbies and preferences of the talent T are set. The talent T of pattern 1 is introduced to the user U1 after the above processing and setting have been performed.
  • The user U1 to whom the talent T has been introduced then performs his or her own processing and setting on the introduced talent T (hereinafter referred to as the "user-side setting").
  • Although the specific contents of the processing and setting in the user-side setting are not particularly limited, in the present embodiment the following elements (1) to (3) are processed and set. That is, based on the introduced talent T, the user U1 processes and sets (1) appearance, (2) voice, and (3) language.
  • For (3) language, any language such as Japanese or English can be set. Even within Japanese, the language can be set at the level of a local dialect (Kansai dialect, Fukushima dialect, Akita dialect, and so on). By allowing the user U1 to set the language used by the talent T in this way, the user U1 can choose not only the language he or she uses every day but also a language he or she wants to learn. As a result, the user U1 can interact with the talent T in the language to be learned and can be expected to engage actively in conversation in that language, which makes it possible to learn multiple languages in a shorter period and to improve communication ability.
  • In pattern 2, as in pattern 1, an augmented reality (AR) avatar Aa whose subject is another user U2 is introduced as a talent T; however, as shown in FIG. 3, the talent T is not selected by the production service. Instead, a "model" M is provided.
  • The "model" M is, among the talents T provided by the user U2, a talent T whose subject is a person, animal, or the like that the user U1 has actually been in contact with in the past or that has some relationship with the user U1.
  • The person or animal that is the subject of the model M may or may not be alive.
  • A one-to-one relationship between the user U1 and the model M introduced as the talent T is maintained. That is, since the model M is treated as an existence unique to the user U1, users U other than the user U1 cannot interact with it.
  • For example, when the user U2 is the father of the user U1, the father (user U2) can provide a model M whose subject is his late wife, that is, the deceased mother of the child (user U1).
  • In this case, the child (user U1) can interact in the virtual space VS with the avatar A (model M) of the deceased mother introduced as the talent T.
  • For example, if the child (user U1) originally had high communication ability but lost it because of some event (for example, the mother's death), this may provide an opportunity to regain that ability.
  • Similarly, a husband (user U2) can provide a model M whose subject is a deceased pet that was loved by his wife (user U1).
  • In this case, the wife (user U1) can interact in the virtual space VS with the avatar A (model M) of the deceased pet introduced as the talent T. That is, in the virtual space VS, even if the subject of an avatar A is an animal, a dialogue is possible through anthropomorphization. Therefore, for example, if the wife (user U1) originally had high communication ability but lost it because of some event (for example, the pet's death), this may provide an opportunity to regain that ability.
  • In pattern 3, an avatar generated on an external portal site PS2, which is different from the portal site PS1 through which the user U1 receives the virtual space service, is introduced as the talent T.
  • Since the talent T introduced in pattern 3 is a completed avatar Au generated on the external portal site PS2, neither the production-side setting nor the user-side setting performed in pattern 1 is carried out. That is, the already existing avatar Au is introduced to the user U1 as a talent as it is.
  • In this way, because completed avatars Au generated on the external portal site PS2 are introduced to the user U1 as talents T, the user U1 can interact with talents T having a variety of personalities.
  • The talent T introduced in pattern 1 described above is intentionally brought closer to the type desired by the user U1 through the production-side setting and the user-side setting. That is, in pattern 1 the user U1 is encouraged to actively talk with the talent T by making the talent T resemble the user U1's favorite type.
  • In pattern 3, by contrast, the already existing avatar Au is introduced as the talent T as it is, so avatars Au of various personalities are introduced as talents T regardless of the preferences of the user U1.
  • As a result, the user U1 can obtain opportunities to interact with avatars Au of any personality, regardless of his or her preferences.
  • Through dialogue with talents T having various personalities, the user U1 can acquire the adaptability required in conversation, and thus can improve practical communication ability.
  • the talent T introduced to the user U1 is managed by, for example, the following method at the portal site PS1 operated by the service provider P to provide the virtual space service.
  • the augmented reality (AR) avatar Aa of the talent T is managed based on the content of the exclusive management contract concluded with the service provider P.
  • the virtual reality (VR) avatar Av of the talent T is managed as follows. That is, with respect to the virtual reality (VR) avatar Av, a technique such as machine learning, deep learning, or AI (artificial intelligence) is used to give an opportunity to interact with the user U as the concierge C or the advisor D described above. Then, deep learning progresses due to the accumulation of dialogues with the user U, the communication ability continues to change while being refined, and the character as an ideal avatar is gradually formed.
  • the fact that the communication ability of the virtual reality (VR) avatar Av is improved means that the communication ability of the user U who is the partner of the dialogue is also improved. That is, by accumulating the dialogue between the user U and the virtual reality (VR) avatar Av, both of them can have an ideal personality.
  • A virtual reality (VR) avatar Av that has many opportunities to interact with users U is one that is accessed by many users U.
  • Conversely, a virtual reality (VR) avatar Av that is rarely accessed has few opportunities to interact with users U. A small amount of dialogue with users U means that its deep learning has not progressed very far. Such a virtual reality (VR) avatar Av tends to have a unique personality, because the personality traits of the specific users U who have been its dialogue partners are strongly expressed in it.
  • A user U who is recognized as having changed his or her personality through accumulated dialogue with talents T in the virtual space VS is treated as follows. That is, based on the user information of that user U, a suitable partner (a real person of the opposite sex) is introduced, with the user U's hobbies and preferences taken into consideration.
  • In this way, the service provider P manages various virtual reality (VR) avatars Av as talents T as described above, and operates the portal site PS1 that provides the virtual space service.
  • FIG. 4 is a diagram showing an outline of the configuration of the information processing system IS including the management server 1 according to the embodiment of the information processing apparatus of the present invention.
  • The information processing system IS shown in FIG. 4 includes a management server 1, user terminals 2-1 to 2-n (where n is an arbitrary integer of 1 or more), an information providing server 3, a financial institution terminal 4, and a medical institution terminal 5, which are connected to one another via a predetermined network N such as the Internet.
  • An external server 6 is connected to the information processing system IS via the network N.
  • the management server 1 is an information processing device managed by a service provider P who provides this service.
  • the management server 1 executes various processes in order to manage the operations of the user terminals 2-1 to 2-n, the information providing server 3, the financial institution terminal 4, and the medical institution terminal 5.
  • Each of the user terminals 2-1 to 2-n is an information processing device operated by each of the users U1 to Un, and is configured by, for example, a personal computer, a smartphone, a tablet, or the like. Note that, hereinafter, when it is not necessary to individually distinguish the users U1 to Un and the user terminals 2-1 to 2-n, they are collectively referred to as "user U" and "user terminal 2". However, the user U does not necessarily operate the user terminal 2, and for example, when the user U is not a human, the user U may not operate the user terminal 2.
  • the information providing server 3 is an information processing device managed by a person who provides various information as a service.
  • the information providing server 3 manages various information applied to the dialogue service and the virtual space service.
  • the financial institution terminal 4 is an information processing device managed by each of the financial institution and the insurance company used by the user U.
  • the financial institution terminal 4 manages financial information and insurance information that can be applied to the dialogue service and the virtual space service.
  • the medical institution terminal 5 is an information processing device managed by a medical institution used by the user U.
  • the medical institution terminal 5 manages medical information that can be applied to the dialogue service and the virtual space service.
  • the external server 6 is an information processing device managed by a person who operates the external portal site PS2.
  • FIG. 5 is a block diagram showing a hardware configuration of the management server 1 of FIG.
  • The management server 1 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a bus 14, an input/output interface 15, an output unit 16, an input unit 17, a storage unit 18, a communication unit 19, and a drive 20.
  • the CPU 11 executes various processes according to the program recorded in the ROM 12 or the program loaded from the storage unit 18 into the RAM 13.
  • the RAM 13 also stores data and the like necessary for the CPU 11 to execute various processes.
  • the CPU 11, the ROM 12, and the RAM 13 are connected to each other via the bus 14.
  • An input / output interface 15 is also connected to the bus 14.
  • An output unit 16, an input unit 17, a storage unit 18, a communication unit 19, and a drive 20 are connected to the input / output interface 15.
  • The output unit 16 is composed of a display such as a liquid crystal display, and outputs various kinds of information.
  • The input unit 17 is composed of various input hardware, such as a keyboard, and is used to input various kinds of information.
  • the storage unit 18 is configured by a DRAM (Dynamic Random Access Memory) or the like, and stores various data.
  • The communication unit 19 controls communication with other devices (for example, the user terminals 2-1 to 2-n in FIG. 1, the information providing server 3, the financial institution terminal 4, the medical institution terminal 5, and the external server 6) via the network N, which includes the Internet.
  • the drive 20 is provided as needed.
  • a removable medium 30 including a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is appropriately mounted on the drive 20.
  • the program read from the removable medium 30 by the drive 20 is installed in the storage unit 18 as needed.
  • the removable medium 30 can also store various data stored in the storage unit 18 similarly to the storage unit 18.
  • FIG. 6 is a functional block diagram showing an example of the functional configuration of the management server 1 of FIG.
  • When the dialogue processing is executed, the information acquisition unit 101 and the message management unit 102 function in the CPU 11 of the management server 1.
  • When the virtual space provision processing is executed, the information acquisition unit 101, the space management unit 103, and the avatar management unit 104 function.
  • When the production processing is executed, the information acquisition unit 101 and the avatar management unit 104 function.
  • a user DB 401, an avatar DB 402, a dialogue DB 403, and an auxiliary DB 404 are provided in one area of the storage unit 18.
  • Dialogue processing means a series of processes for realizing the above-mentioned dialogue service. By executing the dialogue processing, the user U can interact with the concierge C or the advisor D.
  • Virtual space provision processing means a series of processes for realizing the above-mentioned virtual space service.
  • By executing the virtual space provision processing, the user U can interact with avatars in the virtual space.
  • Production process means a series of processes for realizing the above-mentioned production service. By executing the production process, it becomes possible to introduce the talent to the user U.
  • the information acquisition unit 101 acquires information about a user as user information. Specifically, the information acquisition unit 101 acquires user information managed by the information providing server 3. As described above, the user information includes user financial information, user insurance information, user medical information, user biometric information, and matching information. The user information acquired by the information acquisition unit 101 is stored and managed in the user DB 401.
  • the message management unit 102 manages a message transmitted / received as a dialogue via the user terminal 2. Messages transmitted and received as dialogues via the user terminal 2 are stored and managed in the dialogue DB 403. Specifically, in the message management unit 102, the reception control unit 121, the message generation unit 122, and the transmission control unit 123 function. The reception control unit 121 executes control for receiving the message transmitted from the user terminal 2. When there is a message from the user, the message generation unit 122 generates a message to be transmitted to the user based on the content of the message and the user information. If there is no message from the user, a message to be transmitted to the user is generated based on the user information.
  • That is, when there is a message from the user U, the message generation unit 122 generates a message to be transmitted to the user terminal 2 based on the content of that message and the user information of the user U.
  • When there is no message from the user U, a message to be transmitted to the user terminal 2 is generated based on the user information of the user U.
  • Techniques such as machine learning, deep learning, and AI (artificial intelligence) can be used for the message generation by the message generation unit 122.
  • The transmission control unit 123 executes control for transmitting the message generated by the message generation unit 122 to the user terminal 2.
  • the above-described processing executed by the message management unit 102 may be executed in real time or may be executed at predetermined time intervals.
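  • A minimal sketch of the behaviour of the message management unit 102 described above (reply when a user message exists, otherwise generate a proactive message from the user information alone) is shown below. The class, method names, and rules are illustrative assumptions; the actual embodiment may use machine learning, deep learning, or AI for message generation.

      from typing import Optional

      class MessageManager:
          """Illustrative stand-in for the message management unit 102."""

          def __init__(self, user_info: dict):
              self.user_info = user_info          # acquired by the information acquisition unit 101

          def generate_message(self, incoming: Optional[str]) -> str:
              if incoming is not None:
                  # A message from the user exists: base the reply on its content and the user information.
                  return self._reply_to(incoming)
              # No message from the user: proactively generate one from the user information only.
              return self._proactive_message()

          def _reply_to(self, incoming: str) -> str:
              if "hungry" in incoming.lower():
                  return "It's about lunch time. Eat out, or order delivery?"
              return "Tell me more."

          def _proactive_message(self) -> str:
              if self.user_info.get("yesterday_meal") == "meat":
                  return "You ate meat yesterday, so how about fish today?"
              return "How are you doing today?"

      manager = MessageManager({"yesterday_meal": "meat"})
      print(manager.generate_message("I'm hungry"))   # reply driven by the user's message
      print(manager.generate_message(None))           # proactive message driven by user information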
  • the space management unit 103 generates a virtual space VS accessible via the user terminal 2 and manages the generated virtual space VS.
  • the avatar management unit 104 manages one or more avatars A that can exist in the virtual space VS. Specifically, in the avatar management unit 104, the avatar generation unit 141, the selection unit 142, the processing setting unit 143, the setting reception unit 144, and the model reception unit 145 function.
  • The avatar generation unit 141 generates one or more avatars A that can exist in the virtual space VS.
  • the avatars A1 to Am (m is an arbitrary integer value of 1 or more) generated by the avatar generation unit 141 are stored and managed in the avatar DB 402.
  • the selection unit 142 selects, as the talent, an avatar that is a candidate for a companionship partner of the user in the virtual space from the one or more avatars. Specifically, the selection unit 142 selects, as the talent T, an avatar A that is a candidate for a contact partner of the user U in the virtual space VS among the avatars A1 to Am.
  • the process setting unit 143 processes and sets one or more elements of the avatar selected as the talent. Specifically, the processing setting unit 143 performs processing and settings for each element such as the appearance, character, and knowledge level of the avatar A selected as a talent. For example, as described above, the face and the style of the appearance of the avatar A are processed and set. Further, the personality of the avatar A is set such as an outward personality, an inward personality, a thoughtful personality, an emotional personality, a personality that values intuition, and a personality that relies on feelings. Further, hobbies and preferences are set for the intellectual level of the avatar A.
  • the setting reception unit 144 receives processing and setting by the user for the avatar introduced to the user as a talent. Specifically, the setting reception unit 144 receives processing and setting by the user U for the avatar A introduced to the user U as the talent T. The received processing and setting contents are reflected in the target avatar A.
  • The model reception unit 145 receives, as a model, an avatar that is provided by a second user different from the first user and whose subject has a strong relationship with the first user. Specifically, the model reception unit 145 receives, as a model M, an avatar A that is provided by the user U2 and whose subject has a strong relationship with the user U1. For example, as described above, when the user U2 is the father of the user U1, the father (user U2) can provide a model M whose subject is his late wife, that is, the mother of the child (user U1).
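  • The internal units of the avatar management unit 104 (generation, selection, production-side setting, user-side setting, and model reception) could be sketched as follows. The structure mirrors the functional blocks of FIG. 6, but all implementation details are assumptions made only for illustration.

      from dataclasses import dataclass, field
      from typing import Dict, List, Optional

      @dataclass
      class Avatar:
          avatar_id: str
          kind: str                                              # "AR" (subject is a user) or "VR" (artificially created)
          traits: Dict[str, str] = field(default_factory=dict)   # appearance, personality, knowledge level, ...

      class AvatarManager:
          """Illustrative stand-in for the avatar management unit 104."""

          def __init__(self) -> None:
              self.avatars: List[Avatar] = []                    # corresponds roughly to the avatar DB 402

          def generate(self, avatar_id: str, kind: str) -> Avatar:          # avatar generation unit 141
              avatar = Avatar(avatar_id, kind)
              self.avatars.append(avatar)
              return avatar

          def select_talent(self, preferred_kind: Optional[str] = None) -> Optional[Avatar]:   # selection unit 142
              for avatar in self.avatars:
                  if preferred_kind is None or avatar.kind == preferred_kind:
                      return avatar
              return None

          def production_side_setting(self, avatar: Avatar, **traits: str) -> None:   # processing setting unit 143
              avatar.traits.update(traits)                       # e.g. appearance, personality, knowledge level

          def user_side_setting(self, avatar: Avatar, **traits: str) -> None:         # setting reception unit 144
              avatar.traits.update(traits)                       # e.g. appearance, voice, language

          def receive_model(self, avatar: Avatar) -> Avatar:                          # model reception unit 145
              avatar.traits["exclusive_to_one_user"] = "true"    # a model M is unique to a single user
              return avatar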
  • The user information of the user U registered as a user in this service is stored and managed in the user DB 401.
  • the user information includes user financial information, user insurance information, user medical information, user biometric information, matching information, and the like.
  • The avatar DB 402 stores and manages the augmented reality (AR) avatars Aa and virtual reality (VR) avatars Av generated by the avatar generation unit 141, as well as the avatars Au provided from the external portal site PS2.
  • The dialogue DB 403 stores and manages the contents of the dialogues between the user U and the concierge C, the advisor D, and the avatars A. Specifically, for example, each message constituting a dialogue, such as the messages "Mom looks like she has a cold" and "I haven't been in touch much lately" mentioned above, is stored in association with information indicating who issued the message, the time of the message, and so on.
  • The auxiliary DB 404 stores and manages, among the various kinds of information provided from the information providing server 3, the various kinds of information other than the user information. Specifically, for example, the information that drunk driving is prohibited by Article 65 of the Road Traffic Act, which appeared in the dialogue mentioned above, is stored in the auxiliary DB 404 because it belongs to the various kinds of information other than the user information.
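  • The way the dialogue DB 403 associates each message with its speaker and time is not specified in detail in the disclosure; the following is a hypothetical relational sketch of such storage, introduced only for illustration.

      import sqlite3
      from datetime import datetime

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE dialogue (
              id INTEGER PRIMARY KEY,
              speaker TEXT NOT NULL,       -- e.g. 'user U1', 'advisor D', 'avatar A'
              message TEXT NOT NULL,
              issued_at TEXT NOT NULL      -- time the message was issued
          )
      """)
      conn.execute(
          "INSERT INTO dialogue (speaker, message, issued_at) VALUES (?, ?, ?)",
          ("advisor D", "Mom looks like she has a cold.", datetime.now().isoformat()),
      )
      for speaker, message in conn.execute("SELECT speaker, message FROM dialogue"):
          print(speaker, message)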
  • With the management server 1 having the above-described functional configuration, the problems of communication ability, appearance, residence, and time can be solved, which can contribute to improving the real life of the user U.
  • As described above, the virtual reality (VR) avatar Av of a talent T continues to learn, through techniques such as machine learning, deep learning, and AI (artificial intelligence), by repeatedly conversing with the user U. That is, the avatar Av of the talent T keeps changing while refining its communication ability, and eventually its personality is gradually formed into an ideal one. Such avatars whose personalities are formed through dialogue are not limited to the virtual reality (VR) avatars Av of talents T.
  • The avatar of the user U may also learn through the user U's repeated communication with the server. That is, the avatar Aa of the user U may continue to change while refining its communication ability, and may finally be formed, gradually, into an ideal personality. As a result, an avatar Av is generated as a copy having the personality of the user U.
  • The avatar Aa of the user U learns using the user information acquired and accumulated, mainly through communication between the user U and the server, while the user U is alive; after the death of the user U, it becomes the user himself or herself, mainly by continuing deep learning. That is, the avatar that is the copy of the user U in the virtual space can act as an advisor, an intermediary, a representative, and family while the user U is alive. The avatar of the user U then continues to exist even after the human life of the user U itself ends. This means that the consciousness of the user U can live on forever in the virtual space. In other words, this avatar, the copy of the user U, can be said to be an evolved form of humankind.
  • Furthermore, a person in the real world can use a VR device to communicate with the avatar of the user U in the virtual space, so that the user U (as the avatar copy) can, in effect, return to the real world even after the user U dies. Because the copy (avatar) of the user U can thus continue to act in the real world even after the user U's death, it becomes possible to settle the debate over property rights in personal information.
  • The avatars of individuals (all human beings) in the world, that is, copies of each user U that can freely take user information in and out, mean that the user information of individuals (all human beings) in the world is stored as big data in the cloud or the like. That is, in contrast to the conventional information management method in which personal information is stored at each company or national institution, the user information of individuals (all human beings) in the world is stored on a server equipped with AI. As a result, companies and national institutions will be able to use this big data. Furthermore, since the collected big data forms a kind of collective unconscious, its use will contribute widely to society. It is therefore preferable that such a server be incorporated into public infrastructure.
  • the user information is said to include user financial information, user insurance information, user medical information, user biometric information, and matching information, but these are merely examples.
  • Other information about the user U can be the user information.
  • In the above-described embodiment, the dialogue between the user U and the concierge C or the advisor D is performed by exchanging voice information via the microphone and speaker of the user terminal 2, and the dialogue between the user U and the talent T or the model M is performed by exchanging voice information together with image information displayed on the screen of the user terminal 2. That is, the above-described embodiment assumes that the user terminal 2 is a smartphone, a tablet terminal, or a personal computer. However, the user terminal 2 is not limited to these, and may be, for example, a combination of a speaker having only a sound output function, a projector having only an image display function, and a telephone having only a voice communication function.
  • The "dialogue" in the above-described embodiment is assumed to involve exchanging messages in real time, but this is merely an example.
  • The dialogue between the user U and the concierge C, the talent T, or the like may be a mail-style dialogue in which a time interval is allowed between exchanged messages.
  • the target of the production setting in the production service is three types of elements including the appearance, character, and knowledge level of the selected talent T, but this is merely an example.
  • Various elements indicating the selected talent T can be set.
  • For example, the voice and language that are targets of the user-side setting may also be made targets of setting here, and the family structure, family pattern, and the like may also be made targets of setting.
  • the avatar Au generated on the portal site operated by the external server is introduced to the user U as the talent as it is, but the invention is not limited to this.
  • the above-mentioned production-side setting may be performed, or the user-side setting may be performed.
  • the hardware configuration shown in FIG. 5 is merely an example for achieving the object of the present invention, and is not particularly limited.
  • the functional block diagram shown in FIG. 6 is merely an example and is not particularly limited. That is, it is sufficient if the information processing system has a function capable of executing the above-described series of processing as a whole, and what kind of functional block is used to realize this function is not particularly limited to the example of FIG. .
  • the location of the functional block is not limited to that shown in FIG. 6 and may be arbitrary. Further, one functional block may be configured by hardware alone, software alone, or a combination thereof.
  • the program forming the software is installed in a computer or the like from a network or a recording medium.
  • the computer may be a computer embedded in dedicated hardware. Further, the computer may be a computer capable of executing various functions by installing various programs, for example, a general-purpose smartphone or a personal computer other than a server.
  • A recording medium containing such a program is composed not only of a removable medium distributed separately from the apparatus main body in order to provide the program to each user, but also of a recording medium provided to each user in a state of being incorporated in the apparatus main body in advance, and the like.
  • In this specification, the steps describing a program recorded on a recording medium include not only processing performed in chronological order according to that order, but also processing that is executed in parallel or individually, not necessarily in chronological order.
  • In this specification, the term "system" refers to an overall apparatus composed of a plurality of devices, a plurality of means, and the like.
  • In summary, the information processing apparatus to which the present invention is applied may take various embodiments as long as it has the following configuration. That is, the information processing apparatus to which the present invention is applied (for example, the management server 1) is an information processing device including: acquisition means (for example, the information acquisition unit 101 in FIG. 6) that acquires information about a user (for example, the user U1 in FIG. 4) as user information; and message management means (for example, the message management unit 102 in FIG. 6) that manages messages (for example, a dialogue) transmitted and received as a dialogue via a user terminal operated by the user (for example, the user terminal 2-1 in FIG. 4). When there is a message from the user, the message management means generates a message to be transmitted to the user based on the content of that message and the user information; when there is no message from the user, it generates a message to be transmitted to the user based on the user information. This can solve the problem of communication ability and contribute to improving the real life of the user U.
  • The information processing device may further include: space management means (for example, the space management unit 103 in FIG. 6) that generates a virtual space (for example, the virtual space VS) accessible via the user terminal and manages the generated virtual space; and avatar management means (for example, the avatar management unit 104 in FIG. 6) that manages one or more avatars (for example, avatars A) that can exist in the virtual space.
  • The avatar management means can generate a first avatar (for example, the augmented reality (AR) avatar Aa) whose subject is the user and a second avatar (for example, the virtual reality (VR) avatar Av) whose subject is entirely artificially created. This can solve the problem of appearance, the problem of residence, and the problem of time, and contribute to improving the real life of the user U.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The purpose of the present invention is to provide a method that contributes to improving a user's real life by solving problems of communication skills, appearance, place of residence, and time. To this end, an information acquisition unit 101 acquires user information about a user U. A message management unit 102 manages messages that are transmitted and received via a user terminal 2. When there is a message from the user U, the message management unit 102 generates, based on the content of the message and the user information, a message for the user U. When there is no message from the user U, the message management unit 102 generates a message for the user U based on the user information. In this way, the problem addressed by the present invention is solved.
PCT/JP2019/039355 2018-10-12 2019-10-04 Dispositif de traitement d'informations WO2020075647A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020511828A JP7002085B2 (ja) 2018-10-12 2019-10-04 情報処理装置
JP2021203938A JP7468850B2 (ja) 2018-10-12 2021-12-16 情報処理装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-193394 2018-10-12
JP2018193394 2018-10-12

Publications (1)

Publication Number Publication Date
WO2020075647A1 true WO2020075647A1 (fr) 2020-04-16

Family

ID=70164607

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/039355 WO2020075647A1 (fr) 2018-10-12 2019-10-04 Dispositif de traitement d'informations

Country Status (2)

Country Link
JP (2) JP7002085B2 (fr)
WO (1) WO2020075647A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009134394A (ja) * 2007-11-29 2009-06-18 Sony Corp 情報処理装置、情報処理方法、及びプログラム
JP2015194864A (ja) * 2014-03-31 2015-11-05 Kddi株式会社 遠隔操作方法ならびにシステムならびにそのユーザ端末および視聴端末
JP2016071604A (ja) * 2014-09-30 2016-05-09 株式会社日本総合研究所 情報処理装置、情報処理プログラムおよび情報処理方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003067474A (ja) * 2001-08-22 2003-03-07 Hiroyuki Tarumi 物故者データ処理システム
US20080262786A1 (en) 2007-04-19 2008-10-23 The University Of Houston System Non-exercise activity thermogenesis (neat) games as ubiquitous activity based gaming
JP2012168862A (ja) * 2011-02-16 2012-09-06 Nomura Research Institute Ltd 行動情報記録装置
JP2012168863A (ja) * 2011-02-16 2012-09-06 Nomura Research Institute Ltd 行動情報記録装置
US9489679B2 (en) * 2012-10-22 2016-11-08 Douglas E. Mays System and method for an interactive query utilizing a simulated personality
JP2016045815A (ja) 2014-08-26 2016-04-04 泰章 岩井 仮想現実提示システム、仮想現実提示装置、仮想現実提示方法


Also Published As

Publication number Publication date
JPWO2020075647A1 (ja) 2021-02-15
JP2022027940A (ja) 2022-02-14
JP7468850B2 (ja) 2024-04-16
JP7002085B2 (ja) 2022-01-20

Similar Documents

Publication Publication Date Title
US11431660B1 (en) System and method for collaborative conversational AI
CN105912848B (zh) 一种基于app的医疗服务系统
EP3941340A1 (fr) Procédés et dispositifs de thérapie numérique personnalisée
US10831866B2 (en) Systems and methods for facilitating remote care services
Hooper et al. Smart-device environmental control systems: experiences of people with cervical spinal cord injuries
Laxmidas et al. Commbo: Modernizing augmentative and alternative communication
CN111985891A (zh) 一种基于物联网、移动互联网和云计算技术的互联网养老生态圈服务平台系统
Yuan et al. A simulated experiment to explore robotic dialogue strategies for people with dementia
WO2020075647A1 (fr) Dispositif de traitement d'informations
WO2019104411A1 (fr) Système et procédé pour gestion de maladie activée par la voix
Postolache et al. Contextual design of ICT for physiotherapy: toward knowledge and innovation ecosystem
Perez et al. CMSA’s integrated case management: A manual for case managers by case managers
Ahmad IRING TemaniKu: a grab-style integrated application of e-healthcare chaperone services for the elderly living at home in Malaysia
WO2016172665A1 (fr) Systèmes et procédés pour l'administration de services de soins à distance
Ferreira How To Be A Digital Doctor
US11848110B2 (en) Secure patient messaging
Saranto et al. Personal health information management: tools and strategies for citizens' engagement
Osei Stayer youth shaping their transnational family lives: experiences and aspirations of migrants’ children living in Ghana
Priday et al. Tracking Person-Centred Care Experiences Alongside Other Success Measures in Hearing Rehabilitation
Hansen Shaping aged care work through technology: A senior manager affordance perspective
US20240177085A1 (en) Systems and methods for ensuring quality of care services
Hinckley Facilitating life participation in severe aphasia with limited treatment time
Hughes Family and staff perspectives on quality of life, well-being and human rights for people with advanced dementia living in care homes: a case study approach
Fedestus M-Health application for malaria health care workers
Allen Understanding the networks of those using the internet to support self-management and the role of ties mediated online in supporting long-term condition management. A mixed-methods study

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020511828

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19870622

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19870622

Country of ref document: EP

Kind code of ref document: A1