WO2018168998A1 - Bot control management program, method, device, and system - Google Patents

Bot control management program, method, device, and system

Info

Publication number
WO2018168998A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
bot
message
user
displayed
Prior art date
Application number
PCT/JP2018/010218
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
谷口 友彦
祥平 二木
秀之 垣内
徳大 松野
ミンボ バエ
貴大 山田
瑛実 佐藤
Original Assignee
Line株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Line株式会社
Publication of WO2018168998A1 publication Critical patent/WO2018168998A1/ja

Classifications

    • G — PHYSICS
    • G06 — COMPUTING OR CALCULATING; COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 — Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units

Definitions

  • The present invention relates to an apparatus or the like on which a computer program (BOT) that performs work or the like on behalf of a human being is executed, and more specifically to an apparatus, method, system, or the like for controlling and managing such a BOT.
  • In Patent Document 1, a virtual online robot (BOT), or security agent, is provided. This BOT, also called a "Guardian," can monitor online conversations and detect when a conversation meets certain definable criteria indicating that its content is inappropriate or, in some cases, potentially harmful or dangerous.
  • The guardian is configured to utilize natural language processing techniques to evaluate the context of a conversation over its span and/or to evaluate conversations between individual users. By using these techniques, the guardian can provide notifications or warnings to individual users when it is determined that there is a problem with a specific conversation or part thereof.
  • In Patent Document 2, a system for managing the use of BOTs by computer network users is provided. Patent Document 2 discloses a method, executed by a computer, for managing a computer user's use of a computer service.
  • In this method, a network device receives a request from an actual computer user and identifies a virtual user other than the one or more computer users, generating information associated with the virtual user. Using the information associated with the virtual user, the network device sends a communication to the computer service on behalf of the actual computer user who provided the request, so that the communication appears to originate from the virtual user and the computer service is prevented from associating the communication with the actual computer user who provided the request. The network device then receives, from the computer service, a communication addressed to the virtual user, and that communication is sent to the actual computer user who provided the request.
  • Patent Document 1, however, merely discloses a technique in which the guardian (BOT) determines or monitors whether or not the content of an online conversation is appropriate; it does not actively support the conversation itself.
  • Patent Document 2 discloses a technique for managing the use of a public BOT by users of a private computer network; however, it is not intended to develop the conversation itself by having the BOT actively support conversations between users.
  • A BOT control management program according to one aspect of the present disclosure is a program for assisting conversation in a talk room in which a first user associated with a first information processing terminal and one or more second users associated with one or more second information processing terminals participate. The program causes the first information processing terminal to execute a step of displaying a first object in response to a message displayed in the talk room, a step of displaying a second object in response to selection of the first object, and a step of displaying a third object in response to selection of the second object.
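As an illustration of the claimed step sequence, the following is a minimal Python sketch of a terminal-side flow in which a displayed message triggers a first object and each selection reveals the next object. All class, method, and object names here are hypothetical and are not taken from the disclosure:

```python
class ObjectFlow:
    """Hypothetical sketch of the first/second/third object display steps."""

    def __init__(self):
        self.displayed = []  # objects currently shown, in display order

    def on_message(self, message):
        # Step 1: a message displayed in the talk room triggers the first object.
        self.displayed.append("first_object")

    def on_select(self, obj):
        # Steps 2 and 3: selecting an object reveals the next one.
        if obj == "first_object":
            self.displayed.append("second_object")
        elif obj == "second_object":
            self.displayed.append("third_object")


flow = ObjectFlow()
flow.on_message("hello")
flow.on_select("first_object")
flow.on_select("second_object")
print(flow.displayed)  # ['first_object', 'second_object', 'third_object']
```

The essential point of the claim is the chained trigger: each object appears only in response to the preceding message or selection.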
  • FIG. 1 is a diagram illustrating the configuration of a BOT control management system according to an embodiment of the present disclosure. The system includes one or more servers (server 110, server 120) and one or more terminals (terminal 151, terminal 152, terminal 153).
  • The servers 110 and 120 provide a service that realizes transmission and reception of messages between the user-owned terminals 151 to 153 via the network 199.
  • The number of terminals connected to the network 199 is not limited, and neither is the number of servers (servers for expanding functions can be added as needed, or the service may be provided by a single server).
  • The network 199 plays the role of connecting the one or more servers and the one or more terminals. That is, the network 199 is a communication network that provides a connection path over which data can be transmitted and received once the terminals 151 to 153 are connected to the servers 110 and 120.
  • The network 199 may be a wired network or a wireless network, and may include, for example, an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), part of the Internet, part of a public switched telephone network (PSTN), a mobile phone network, an ISDN (integrated services digital network), LTE (long term evolution), CDMA (code division multiple access), Bluetooth (registered trademark), satellite communication, or the like, or a combination of two or more of these. However, the network 199 is not limited to these, and it can also include one or more networks.
  • Each terminal may be any information processing terminal capable of realizing the functions described in each embodiment. The terminals 151 to 153 are typically smartphones, but may also be mobile phones (for example, feature phones), computers (for example, desktops, laptops, tablets, etc.), media computer platforms (for example, cable or satellite set-top boxes, digital video recorders), handheld computer devices (for example, personal digital assistants (PDAs), e-mail clients, etc.), wearable terminals (glass-type devices, watch-type devices, etc.), or other types of computers or communication platforms. However, the terminals 151 to 153 are not limited to these. The terminals 151 to 153 may also be expressed as information processing terminals.
  • In the following, the terminal 151 will be described as a representative, and the terminal 152 and the terminal 153 will be described as other terminals as necessary.
  • Likewise, the server 110 will be described as a representative. The server 110 has a function of providing a predetermined service to the terminal 151 and the like, and may be any information processing device capable of realizing the functions described in each embodiment.
  • The server 110 is typically a server device, but may also be a computer (for example, a desktop, laptop, or tablet), a media computer platform (for example, a cable or satellite set-top box, or a digital video recorder), a handheld computing device (for example, a PDA or an e-mail client), or another type of computer or communication platform. However, the server 110 is not limited to these. The server 110 may also be expressed as an information processing apparatus.
  • HW (hardware) configuration: the terminal 151 includes a control device 1510 (CPU: central processing unit), a storage device 1515, a communication I/F 1516 (interface), an input/output device 1517, a display device 1518, a microphone 1519a, a speaker 1519b, and a camera 1519c. The components of the HW of the terminal 151 are connected to each other via, for example, a bus B2.
  • the communication I / F 1516 transmits and receives various data via the network 199.
  • the communication may be executed either by wire or wireless, and any communication protocol may be used as long as mutual communication can be executed.
  • the communication I / F 1516 has a function of executing communication with the server 110 via the network 199.
  • the communication I / F 1516 transmits various data to the server 110 in accordance with instructions from the control device 1510.
  • the communication I / F 1516 receives various data transmitted from the server 110 and transmits the data to the control device 1510.
  • The input/output device 1517 includes a device for inputting various operations to the terminal 151 and a device for outputting processing results processed by the terminal 151. The input device and the output device may be integrated, or may be separate (in the latter case, for example, an input device 1517a such as a touch panel input sensor and an output device 1517b such as a touch panel output device or a vibration drive mechanism).
  • The input device is realized by any one, or a combination, of all types of devices that can receive input from the user and transmit information related to the input to the control device 1510. It is typically realized by a touch panel or the like, which detects contact by a pointing tool such as the user's finger or a stylus, together with the contact position, and transmits the coordinates of the contact position to the control device 1510.
  • The input device may also be realized by an input device other than a touch panel, for example, hardware keys typified by a keyboard, a pointing device such as a mouse, a camera (operation input via a moving image), or a microphone (operation input by sound). However, the input device is not limited to these.
  • The output device is realized by any one, or a combination, of all types of devices that can output the processing results processed by the control device 1510. It is typically realized by a touch panel or the like, but may also be realized by an output device other than a touch panel, for example, a speaker (audio output), a lens (for example, 3D (three-dimensional) output or hologram output), a printer, or the like. However, the output device is not limited to these.
  • The display device 1518 is realized by any one, or a combination, of all types of devices that can display according to display data written in a frame buffer (not shown). It is typically realized by a monitor (for example, a liquid crystal display or an OELD (organic electroluminescence display)), and may be a head mounted display (HMD).
  • The display device 1518 may also be realized by a device that can display an image, text information, or the like by projection mapping, as a hologram, or in the air (which may be a vacuum). Note that these display devices 1518 may be capable of displaying display data in 3D. However, in the present disclosure, the display device 1518 is not limited to these.
  • When the input/output device 1517 is a touch panel, the input/output device 1517 and the display device 1518 may have substantially the same size and shape, and may be arranged to face each other.
  • The control device 1510 has a physically structured circuit for executing functions realized by the codes or instructions included in a program, and is realized by, for example, a data processing device built into hardware. The control device 1510 is typically a central processing unit (CPU), and may also be a microprocessor, a processor core, a multiprocessor, an ASIC (application-specific integrated circuit), or an FPGA (field-programmable gate array). However, in the present disclosure, the control device 1510 is not limited to these.
  • The storage device 1515 has a function of storing the various programs and various data necessary for the terminal 151 to operate. The storage device 1515 is realized by various storage media such as a hard disk drive (HDD), a solid state drive (SSD), a flash memory, a random access memory (RAM), and a read only memory (ROM). However, in the present disclosure, the storage device 1515 is not limited to these.
  • the terminal 151 stores the program P in the storage device 1515 and executes the program P, whereby the control device 1510 executes processing as each unit included in the control device 1510. That is, the program P stored in the storage device 1515 causes the terminal 151 to realize each function executed by the control device 1510.
  • the microphone 1519a is used for inputting voice data.
  • the speaker 1519b is used for outputting audio data.
  • the camera 1519c is used for acquiring moving image data.
  • the server 110 includes a control device 1110 (CPU), a storage device 1105, an input / output device 1106, a display 1107, and a communication I / F 1108 (interface). Each component of the HW of the server 110 is connected to each other via a bus B1, for example.
  • The control device 1110 has a physically structured circuit for executing functions realized by the codes or instructions included in a program, and is realized by, for example, a data processing device built into hardware. The control device 1110 is typically a central processing unit (CPU), and may be a microprocessor, a processor core, a multiprocessor, an ASIC, or an FPGA. However, in the present disclosure, the control device 1110 is not limited to these.
  • the storage device 1105 has a function of storing various programs and various data necessary for the server to operate.
  • the storage device 1105 is realized by various storage media such as an HDD, an SSD, and a flash memory. However, in the present disclosure, the storage device 1105 is not limited to these.
  • the input / output device 1106 is realized by a device that inputs various operations to the server 110.
  • the input / output device 1106 is realized by any one or a combination of all types of devices that can receive input from a user and transmit information related to the input to the control device 1110.
  • the input / output device 1106 is typically realized by a hardware key typified by a keyboard or the like, or a pointing device such as a mouse.
  • the input / output device 1106 may include, for example, a touch panel, a camera (operation input via a moving image), and a microphone (operation input by sound).
  • the input / output device 1106 is not limited thereto.
  • The display 1107 is typically realized by a monitor (for example, a liquid crystal display or an OELD (organic electroluminescence display)), and may be a head mounted display (HMD) or the like. Note that these displays 1107 may be capable of displaying display data in 3D. However, in the present disclosure, the display 1107 is not limited to these.
  • a communication I / F 1108 transmits and receives various data via the network 199.
  • the communication may be executed either by wire or wireless, and any communication protocol may be used as long as mutual communication can be executed.
  • the communication I / F 1108 has a function of executing communication with the terminal 151 via the network 199.
  • Communication I / F 1108 transmits various data to terminal 151 in accordance with instructions from control device 1110.
  • the communication I / F 1108 receives various data transmitted from the terminal 151 and transmits the data to the control device 1110.
  • the server 110 stores the program P in the storage device 1105, and the control device 1110 executes processing as each unit included in the control device 1110 by executing the program P. That is, the program P stored in the storage device 1105 causes the server 110 to realize each function executed by the control device 1110.
  • The control device 1510 of the terminal 151 and/or the control device 1110 of the server 110 may realize each process not only by a CPU but also by a logic circuit (hardware) or a dedicated circuit formed in an integrated circuit (an IC (integrated circuit) chip, an LSI (large-scale integration), or the like). These circuits may be realized by one or a plurality of integrated circuits, and a plurality of processes shown in the embodiments may be realized by a single integrated circuit. An LSI may also be referred to as a VLSI, super LSI, ultra LSI, or the like depending on the degree of integration.
  • The program P is a software program (computer program). The storage medium can store the program in a "non-transitory tangible medium."
  • The storage medium may be one or more semiconductor-based or other integrated circuits (ICs) (such as field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), a hard disk drive (HDD), a hybrid hard drive (HHD), an optical disc, an optical disc drive (ODD), a magneto-optical disc, a magneto-optical drive, a floppy diskette, a floppy disk drive (FDD), a magnetic tape, a solid state drive (SSD), a RAM drive, a secure digital card or drive, any other suitable storage medium, or any suitable combination of two or more of these.
  • the server 110 and / or the terminal 151 can realize the functions of a plurality of functional units shown in each embodiment by, for example, reading the program P stored in the storage medium and executing the read program P.
  • The program P may be provided to the server 110 or the terminal 151 via any transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • the server 110 and / or the terminal 151 execute the program P downloaded via the Internet or the like, thereby realizing the functions of a plurality of functional units shown in each embodiment.
  • Each embodiment of the present disclosure may also be realized in the form of a data signal embedded in a carrier wave, in which the program P is embodied by electronic transmission.
  • At least a part of the processing in the server 110 and/or the terminal 151 may be realized by cloud computing configured by one or more computers.
  • The server 110 may be configured to perform at least part of the processing in the terminal 151; in this case, for example, at least a part of the processing of each functional unit of the control device 1510 of the terminal 151 may be performed by the server 110. Conversely, the terminal 151 may be configured to perform at least part of the processing in the server 110; for example, at least a part of the processing of each functional unit of the control device 1110 of the server 110 may be performed by the terminal 151.
  • A configuration involving determination is not essential; predetermined processing may be performed when a determination condition is satisfied, or predetermined processing may be performed when the determination condition is not satisfied.
  • The program of the present disclosure can be implemented using, for example, a script language such as ActionScript or JavaScript (registered trademark), an object-oriented programming language such as Objective-C or Java (registered trademark), or a markup language such as HTML5. However, the present disclosure is not limited to these.
  • FIG. 2 shows a configuration example of a system or apparatus illustrating variations of the embodiment of the present invention. A characteristic point of the configuration shown in FIG. 2 (and FIG. 3) is that its processing operations can be realized not only by cooperative operation between a server and a terminal via a network, but also on a terminal alone.
  • The information processing system 200 includes, as its minimum configuration, an information processing server 210 and various information processing terminals used by the players (illustrated in the figure as PCs 220 and 230, a mobile phone 240, and portable information terminals or tablet terminals 250 to 252; hereinafter also collectively referred to as "terminals"). As shown in FIG. 2, the server and the various terminals are connected via lines (lines 260, 270, 280, 290) so that they can communicate with each other. Each line may be wired or wireless; in the wireless case, the mobile phone 240 and the portable information terminal or tablet terminal 250 enter the Internet 290 wirelessly via a base station, a wireless router, or the like (not shown), and are further connected to the information processing server 210 via the line 280 so as to be able to communicate with it.
  • A program or software necessary for implementing the present invention is normally installed or stored in an HDD or SSD in the storage unit of a PC or a portable information terminal, and a memory in the storage unit (or a DSP (digital signal processor)) is used as necessary.
  • The hardware configuration of the information processing server 210 can basically be that of a PC. As necessary, in order to raise the hardware specifications, the information processing server 210 can also adopt a configuration suitable for processing large-scale data by operating a plurality of PCs (for example, several tens to several tens of thousands) in parallel.
  • FIG. 3 shows a basic operation example in the system or apparatus according to the embodiment of the present invention illustrated in FIG. 1 or FIG. 2. What is particularly intended to be shown here is that the present invention can be realized by cooperative operation with a server or terminal via a network, or on a single terminal.
  • In FIG. 3, "user terminals" correspond to the terminals 151 to 153 in FIG. 1 and the terminals 220, 230, 240, and 250 to 252 in FIG. 2, and "information processing server" corresponds to the servers 110 and 120 in FIG. 1 and the information processing server 210 in FIG. 2.
  • In FIG. 3, t1 to t10 indicate a time-series flow, and the operations and processes described below are performed over time. Note that the operation and processing times (t1 and so on) illustrated in the embodiment are given for ease of understanding of the concept of the present invention, and the present invention is not limited to the individual time-series relationships illustrated.
  • First, the user downloads, from the information processing server via the user terminal, application software for operating his or her user terminal as the information processing terminal according to the present invention (step S301). This application software is client software or application software for processing part or all of the program according to the present invention. The downloaded application software is installed in the user terminal (step S302).
  • As user registration, the user terminal uploads, as necessary, profile information as shown in the following table to the information processing server in addition to the user's own mail address (step S303), and the registration is managed (step S304). The above data items are stored as user data in a storage device on the information processing server (step S305).
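The registration steps S303 to S305 can be sketched as follows. This is a minimal illustration only: the profile field names and the use of the mail address as a key are assumptions, since the actual data items are those defined in the table referenced above:

```python
class RegistrationServer:
    """Hypothetical sketch of user registration (steps S303-S305)."""

    def __init__(self):
        self.users = {}  # stands in for the storage device on the server

    def register(self, mail_address, profile):
        # S304: manage the registration, keyed here by the mail address.
        if mail_address in self.users:
            return False  # already registered
        # S305: store the uploaded profile items as user data.
        self.users[mail_address] = dict(profile)
        return True


server = RegistrationServer()
ok = server.register("user@example.com", {"nickname": "taro", "age": 30})
```

A second registration attempt with the same mail address would be rejected, which is one simple way the server could "manage" registrations.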
  • When user registration is completed, the server can start providing services to the user (player) terminal.
  • Next, the user who downloaded and installed the application on the user terminal activates the application software at time t4 (step S306). From time t4 to time t5, the user enjoys the service provided via the information processing terminal.
  • Thereafter, the user interrupts or terminates the application software according to the embodiment of the present invention. At that time, the status information of the application is transferred to the information processing server (step S307); the server receives this, updates the status information as the user information of that user (step S308), and saves it (step S309). In FIG. 3, these processes are completed by time t6.
  • Omission of step S304 is also possible; step S305 and steps S308 to S309 can likewise be omitted, with the necessary information stored in a memory on the terminal instead.
  • In FIG. 3, from time t7 to time t10, an example embodiment is shown in which at least part of the application software according to an embodiment of the present invention is executed in the information processing server. Here, the user (player) performs two typical user terminal operations, a login operation and command transmission, and receives necessary data transmission or service provision from the information processing server.
  • When the user performs login processing to the server through his or her information processing terminal at time t7 (step S310), the necessary authentication processing is appropriately performed in the information processing server (step S311), and data for the user to receive service provision is transmitted (step S312). Such data includes, for example, a top menu screen configured to be able to receive commands from the terminal, an application startup screen, and the like.
  • Next, the user transmits some command via the information processing terminal (step S313). This command may be a menu selection displayed on the menu screen or, in the case of an application startup screen, a start command for starting an application. The server side receives this command and starts service processing (step S314), and a service corresponding to the request from the terminal is provided by the server (step S315).
  • A command can be transmitted from the terminal at any time after time t10 (for example, a message transmission command or a menu selection command), and the server processes each command it receives from the terminal (for example, forwarding a received message to another terminal, or analyzing the message and returning the result).
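The command handling described above (step S313 onward) can be sketched as a dispatch function. The command names, payload shapes, and the trivial word-count "analysis" are illustrative assumptions, not the actual command vocabulary of the disclosure:

```python
def handle_command(command, payload, relay):
    """Hypothetical server-side command dispatch (steps S313-S315).

    `relay` is a callable that forwards a message to other participants;
    the command vocabulary here is purely illustrative.
    """
    if command == "send_message":
        relay(payload)  # forward the received message to another terminal
        return "forwarded"
    if command == "analyze_message":
        # return a trivial analysis result (word count, as a stand-in)
        return {"words": len(payload.split())}
    if command == "menu_select":
        return f"selected:{payload}"
    return "unknown_command"


outbox = []
result = handle_command("send_message", "hello all", outbox.append)
analysis = handle_command("analyze_message", "hello all", outbox.append)
```

Each incoming command is matched to its service processing, mirroring the loop in which the server handles message forwarding and analysis requests as they arrive.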
  • Although not shown in FIG. 3, a user can also send a message from a user terminal to a specific partner or a specific number of partners. This message is relayed by the information processing server, transferred to the specific partner or specific number of partners, and received on the other party's terminal. The sent message can also be confirmed on the sender's own terminal. These message processes are realized by the functional configurations, described later, that are installed in the server and the terminals.
  • The terminal 151 includes, as functions realized by the control device 1510 and the programs and data stored in the storage device 1515, a talk participation unit 1511 and a message processing unit 1512.
  • The talk participation unit 1511 has a function of performing processing for participating in a desired talk room. A user can participate in a talk room on an individual basis as well as on a group basis, and a new talk room can also be generated.
  • A BOT server (for example, the servers 110 and 120) can participate in a talk room in the same manner as an individual user by providing the functions illustrated with reference to the terminal 151; in this case, the BOT, as one user, is given one account of the instant messenger service. Messages are then exchanged, via the server, with the other participant terminals participating in the same talk room and/or with the BOT server, which participates in the talk room as a single user.
  • The message processing unit 1512 has a function of performing processing such as generating messages in the talk room, transmitting and receiving them, and controlling the display of the transmitted and received messages on the own terminal. As an example, received messages are displayed on the left side and transmitted messages on the right side, ordered along a time axis running from top to bottom.
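The left/right display rule can be sketched as follows; the message tuple format `(timestamp, sender, text)` and the user identifiers are illustrative assumptions:

```python
def layout_messages(messages, my_user_id):
    """Hypothetical sketch: received messages go on the left, the user's own
    messages on the right, ordered top to bottom along the time axis."""
    rows = []
    for timestamp, sender, text in sorted(messages):
        side = "right" if sender == my_user_id else "left"
        rows.append((side, text))
    return rows


msgs = [(2, "bob", "hi!"), (1, "alice", "hello"), (3, "alice", "bye")]
print(layout_messages(msgs, "alice"))
# [('right', 'hello'), ('left', 'hi!'), ('right', 'bye')]
```

Sorting by timestamp reproduces the top-to-bottom time axis, and the sender comparison decides the side of each bubble.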
  • The display processing unit 1513 displays the message data processed by the message processing unit 1512 on the display device 1518. The display processing unit 1513 has a function of converting display data (for example, character codes) into pixel information (for example, fonts or pictographs) and writing the data to a frame buffer (not shown) of the display device 1518.
  • The server 110 includes, as functions realized by the control device 1110 and the programs and data stored in the storage device 1105, a talk room management unit 1111, a message processing unit 1112, a billing information management unit 1113, a billing processing unit 1114, a statistical processing unit 1115, and a BOT processing unit 1116.
  • The talk room management unit 1111 has a function of managing talk room participants and the like.
  • When the message processing unit 1112 receives, from the terminal 151, a message transmitted in a specific talk room, it has a function of transmitting (transferring) the message to the other participant terminals that are its destinations (for example, the terminals 152 and 153) and/or to a BOT server (for example, the server 120).
  • The billing information management unit 1113 performs calculation processing and management for charging the account of a user (including a BOT), for example, in accordance with the provision of a chargeable message or the like (described later). Billing processing for each user (including BOT) account is performed by the billing processing unit 1114.
  • The statistical processing unit 1115 performs statistical processing from various viewpoints based on the data processed by the billing information management unit 1113 and/or the billing processing unit 1114, and stores and manages the results.
  • the BOT processing unit 1116 performs processing for performing work, determination, advice, and the like on behalf of the user (human).
  • Although the present invention is not limited to these, more specifically, the BOT processing unit 1116 includes processing modules such as a search unit 1116a that searches for appropriate data, a language processing unit 1116b that processes the language (including spoken language) contained in transmitted and received messages, an AI processing unit 1116c that determines the meaning and value of the processed language and performs learning based on the success or failure of its determination results, and an image processing unit 1116d that performs recognition processing and the like on analyzed or transferred objects.
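The modular structure of the BOT processing unit 1116 can be sketched as a simple pipeline. The behavior of each stub (tokenizing as "language processing," a question heuristic as "AI determination," a canned string as "search") is purely illustrative and does not reflect the actual modules:

```python
class BotProcessor:
    """Hypothetical sketch of the BOT processing unit 1116 and its modules."""

    def search(self, query):           # cf. search unit 1116a
        return f"results_for:{query}"

    def language(self, text):          # cf. language processing unit 1116b
        return text.lower().split()

    def ai_judge(self, tokens):        # cf. AI processing unit 1116c
        return "question" if tokens and tokens[-1].endswith("?") else "statement"

    def handle(self, message):
        # Pipeline: language processing -> AI determination -> search if needed.
        tokens = self.language(message)
        kind = self.ai_judge(tokens)
        if kind == "question":
            return self.search(" ".join(tokens))
        return None


bot = BotProcessor()
```

The point of the sketch is the division of labor: each unit handles one stage, and the BOT processing unit coordinates them to act on behalf of the user.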
  • As described above, a BOT server (as an example, the servers 110 and 120) can exhibit the same functions as an individual user by providing the functions illustrated with reference to the terminal 151 (at this time, the BOT, as one user, is given one account of the instant messenger service, similar to a human user). The assistant BOT provided by the BOT server and a human user can then establish a predetermined relationship with each other, for example by forming a friend relationship (linking accounts), just as human users do. (The predetermined relationship is described and managed as appropriate in a database or relationship table (not shown).)
  • a user who participates in a talk room operated by a server or the like transmits/receives messages via a terminal, or interacts with his/her assistant (BOT).
  • the server that provides the messaging service is a talk server (the server 110 in FIG. 1 corresponds to this), and a BOT server (as an example, the server 120 in FIG. 1 corresponds to this) participates in the talk room as one user (one account) in the same manner as a user terminal.
  • FIG. 4 is a basic operation flow mainly shown from the viewpoint of one user.
  • in step S401 in FIG. 4, when one user (a human user, not a BOT) enters the talk room by starting an application via the user terminal or the like, the process proceeds to step S402, where status information such as a talk list for operating the application and/or information such as read messages is read from the talk server into the user terminal, or read into a memory in the user terminal from the storage device of the user terminal itself.
  • the talk list is a list of talk rooms operated by one or more talk servers. As an example, a user who is a member of a talk room enters it according to a selection instruction or the like and holds a group talk with the other member users. A group talk proceeds by the members exchanging messages (sending to and receiving from each other) via the messaging service provided by the talk server.
  • the loop from step S403 to step S411 is the basic operation loop; message transmission/reception with other members can continue unless the application is terminated (that is, the user exits the talk room) in step S411.
  • an assistant BOT (an advisor program having a so-called AI function) can be allowed to participate in a talk room as a user having one account. The assistant BOT can give advice to the user, and at least a part of a message received from another user can be transferred to a talk server or the like. The relationship between a user and an assistant BOT is handled in the same way as a relationship between human users.
  • in step S403, when a message is transmitted by a user operation (not shown) (Yes), the process proceeds to step S404 and the message is transmitted to the talk server (in the case of No in step S403, the process proceeds to step S405).
  • examples of the basic operations processed according to the content of the message transmitted in step S404 are the following (1) to (3).
  • (1) the talk server transmits (forwards) the received message to the other members in the talk room, extracts one or more BOT candidates related to the received message (that is, candidates that can respond appropriately to it), and returns them to the user who sent the message. These BOT candidates include a BOT_ID associated with each BOT, a character image representing each BOT, and/or a message presented as that BOT's utterance.
  • (2) when a BOT_ID (described above) is transmitted from the user terminal to the talk server, the talk server makes a request to the corresponding BOT (BOT server) based on the designated BOT_ID. As an example, a request is made to the BOT server for the content candidates it currently has prepared (candidates that can be provided along the flow of conversation in the talk room at that time).
  • (3) the BOT_ID may also be transmitted together with the user's message.
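The handshake described above might be sketched as follows. The keyword registry, the data shapes, and the function names are illustrative assumptions; the patent does not specify a concrete data format for BOT candidates:

```python
# Hypothetical BOT registry: which keywords each BOT can respond to.
BOT_REGISTRY = {
    "bot_gourmet": {"keywords": {"ginza", "roppongi"}, "image": "chick.png"},
    "bot_dance":   {"keywords": {"dance"},             "image": "cat.png"},
}

def extract_bot_candidates(message):
    """Return BOT candidates able to respond to the received message,
    each carrying a BOT_ID, a character image, and an utterance message."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return [
        {"BOT_ID": bot_id, "image": info["image"],
         "utterance": f"I can help with {', '.join(sorted(info['keywords'] & words))}!"}
        for bot_id, info in BOT_REGISTRY.items()
        if info["keywords"] & words
    ]

def designate_bot(candidates, bot_id, user_message=None):
    """Build the request the terminal sends back to the talk server;
    the BOT_ID may be transmitted together with the user's message."""
    assert any(c["BOT_ID"] == bot_id for c in candidates)
    return {"BOT_ID": bot_id, "message": user_message}
```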
  • in step S405, if a message or the like addressed to the user has been received (Yes), the process proceeds to step S406, where it is determined whether the user (or the talk group in which the user is participating) has an assistant BOT (that is, is linked to one). In the case of Yes, the process proceeds to step S407a, and a part or all of the message received in step S405 is selected. The process then proceeds to step S407b, where the selection is transferred to the talk server via the assistant BOT, and then to step S408 (if No in step S405, the process proceeds directly to step S408). In steps S407a to S407b, how much of the received message is transferred to the talk server via the assistant BOT is determined by a permission setting made by the user.
  • as for the transfer timing, variations such as the following can be applied: (1) every time a message is received, at least a part of it is transferred to the talk server; (2) every time a received message is displayed on the terminal, at least a part of it is transferred to the talk server.
  • it can also be controlled whether, among the received messages, a message that is not displayed on the terminal is transferred to the talk server.
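The permission and timing controls described for steps S407a to S407b could look roughly like this. The permission values ("none"/"partial"/"full") and the timing labels are invented for illustration, not taken from the patent:

```python
def select_forwarded_part(message, permission):
    """Return the part of a received message to forward via the assistant BOT,
    according to the user's permission setting."""
    if permission == "none":
        return None
    if permission == "partial":          # forward only a leading fragment, as one example
        return " ".join(message.split()[:3])
    return message                       # permission == "full"

def on_receive(message, permission, timing, displayed):
    """timing == 'on_receive': forward on every arrival (variation 1).
    timing == 'on_display': forward only when the message is shown (variation 2)."""
    if timing == "on_display" and not displayed:
        return None
    return select_forwarded_part(message, permission)
```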
  • in step S408, the received message or the like is displayed on the user terminal.
  • in step S409, if an object display (described later) has been received by the user (Yes), the process proceeds to step S410, and the received object is displayed as appropriate on the terminal (described later). The process then proceeds to step S411 (in the case of No in step S409, the process proceeds directly to step S411).
  • in step S411, when an instruction to leave the talk room is given by a user operation (not shown), this flow ends (step S412).
  • FIG. 5 is a basic operation flow mainly shown from the viewpoint of the talk server.
  • the process proceeds to step S502, and various status information for operating the talk room is read into the memory in the server.
  • the loop from step S503 to step S511 is the basic operation loop; unless the processing is terminated in step S511, the talk server waits for a message or the like (typically a message not including a BOT_ID, though there may also be a message including a BOT_ID, or a message with only a BOT_ID) or for content (the content itself, or a designation message designating the content).
  • in step S503, it is determined whether or not a message has been received. If No, the process proceeds to step S509. If Yes, the process proceeds to step S504, where it is determined whether or not the received message includes a BOT_ID. In the case of No in step S504, a message or the like has been received from another user, so the process proceeds to step S507, where a BOT (there may be more than one) that can respond in accordance with the received message content is extracted or specified. In one embodiment, this BOT extraction or specification is realized by an inquiry from the talk server to the BOT server(s) and their responses. In step S508, the BOT candidates extracted or specified in the previous step are transmitted to the other users in the talk group, and the process proceeds to step S509.
  • in the case of Yes in step S504, the received message is typically a designation of a BOT candidate (already presented), so the process proceeds to step S505, and a content or content-candidate request is sent to the designated specific BOT (server).
  • in step S506, when content candidates are transmitted from the specific BOT (server), the content candidates received in step S506 are transmitted to the user terminal that sent the message, and the content candidates are further transmitted to the other users' terminals via this user terminal (or without going through this user's terminal), thereby realizing real-time sharing of the content. The process then proceeds to step S509.
  • in step S509, it is determined whether or not the content itself or a message designating content has been received. If Yes, the process proceeds to step S510, and the content is transmitted to the other users in the group (if No in step S509, the process proceeds to step S511). At this time, if the received message is a message designating content (for example, content information obtainable by specifying a URL), the content is acquired by the talk server as necessary and transmitted to the other users in the group. The process then proceeds to step S511.
  • in step S511, it is determined whether or not to end the operation as the talk server. If so, the process proceeds to step S512 and this flow ends; if not, the process returns to step S503.
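The talk-server dispatch of FIG. 5 (steps S503 to S510) can be condensed into a sketch like the following, where the helpers for candidate extraction, content requests, and broadcasting are passed in as stand-ins for the BOT-server inquiries described above. The function shapes are assumptions for illustration:

```python
def handle_message(msg, extract_candidates, request_content, broadcast):
    """Dispatch one received message; msg is a dict that may carry
    'text' and/or 'BOT_ID' (cf. the determination in step S504)."""
    if msg.get("BOT_ID"):                         # S504 Yes -> S505/S506
        content = request_content(msg["BOT_ID"])  # ask the designated BOT server
        broadcast(content)                        # share with the group in real time
        return content
    # S504 No -> S507/S508: extract BOTs able to respond, send to the group
    candidates = extract_candidates(msg.get("text", ""))
    broadcast({"bot_candidates": candidates})
    return candidates
```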
  • Second Embodiment In the second embodiment, exchange of messages and the like between a plurality of user terminals (terminal 1 and terminal 2 as an example), a talk server, and a BOT server will be described as an operation scene of the entire system.
  • a description will be given with reference to FIG. A part or all of the contents described in the second embodiment can be applied to other embodiments except for the case where the features are mutually exclusive.
  • note that the operation or processing times (t1 and so on) exemplified in the embodiments are given to ease understanding of the concept of the present invention, and the invention is not limited to the individual time-series relationships exemplified (hereinafter, the same applies to embodiments that explicitly illustrate times).
  • the user 1 creates a message at the terminal 1 (601) at time t0 and transmits it to the talk server 603. From time t1 to t2, the talk server 603 analyzes the received message and performs processing for specifying a BOT that can handle this message.
  • the BOT specifying process is realized by an inquiry process to a plurality of BOT servers that can make an inquiry from the talk server.
  • the message received from the terminal 1 (601) is forwarded from the talk server 603 to the terminal 2 (604) of another user (typically, another user participating in the same talk group as the user of the terminal 1).
  • the terminal 2 (604) receives the message from the terminal 1 (601) forwarded by the talk server 603, and the user of the terminal 2 enters the talk room, for example upon being notified of the reception, and checks the received message. At this time, the user can determine which of the messages received from the terminal 1 (601) are to be transferred to the talk server (in one embodiment of the present invention, this determination can be processed as the terminal 2 (604) consenting to receive messages, including advice messages, from the assistant BOT associated with the terminal 1 (601)). At time t3, the messages determined to be transferred are transmitted (transferred) from the terminal 2 (604) to the talk server 603.
  • at time t2, the talk server 603 finishes analyzing the message received from the terminal 1 (601) at time t0, and transmits BOT candidate data corresponding to the received message to the terminal 1 (601).
  • the candidate data of this BOT can, as an example, be discarded by a user operation described later with reference to FIG. 7A. Note that the BOT candidate transmission timing from the talk server 603 to the terminal 1 (601) at time t2 and the message transmission timing from the terminal 2 (604) to the talk server 603 at time t3 may be interchanged depending on the implementation.
  • the talk server 603 comprehensively analyzes the messages received so far from the terminal 1 (601) and the terminal 2 (604), and performs processing to identify BOT candidates that can handle these messages.
  • the BOT candidates specified by the talk server 603 are transmitted to the terminal 2 (604). Here, a BOT associated with the terminal 1 (601) through a friend relationship and a BOT directly associated with the terminal 2 (604) (having a direct friend relationship) can be displayed on the terminal 2 (604) in distinguishable ways (in one embodiment, the directly associated BOT is displayed as a normal object, and the BOT associated via the terminal 1 (601) is displayed with mosaic processing).
  • the BOT candidates specified by the talk server 603 are also transmitted to the terminal 1 (601) (in the figure, this is the second flick UI, directed to the terminal 1 (601)). On the terminal 1 (601), these BOT candidates are displayed as a flick UI.
  • here, a BOT associated with the terminal 2 (604) (having a friend relationship with it) can be included in the BOT candidates by virtue of the consent process at time t3. On the terminal 1 (601), the BOT directly associated with the terminal 1 (601) (having a direct friend relationship) and the BOT not directly linked to the terminal 1 (601) but directly linked to the terminal 2 (604) can be displayed in distinguishable ways (as one embodiment, the former BOT is displayed as a normal object, and the latter BOT is displayed with mosaic processing).
  • the timing of the flick UI for the terminal 2 (604) at time t4 and the timing of the flick UI for the terminal 1 (601) at time t5 may be interchanged depending on the implementation.
  • one of the BOT candidates presented on the terminal 1 (601) via the flick UI is selected (that is, the user of the terminal 1 requests further information provision from the selected BOT), and the identifier (BOT_ID) of the selected BOT is transmitted from the terminal 1 (601) to the talk server 603. Note that the BOT_ID can also be included in a message and transmitted.
  • the talk server identifies the BOT from the received BOT_ID, and requests a message from the identified BOT (server) (time t7).
  • the request message is a request for further content and related content, or for candidates of these (content candidates).
  • content candidates are returned from the BOT server 602 to the talk server 603, and at time t9, these content candidates are transferred from the talk server 603 to the terminal 1 (601).
  • together with the transfer of the content candidates from the talk server 603 to the terminal 1 (601) at time t9, or instead of that transfer, the talk server 603 can also broadcast the content candidates to the members in the talk room so that the content candidate information is shared substantially in real time (this is the event at time t10 in FIG. 6).
  • one of the content candidates is selected at the terminal 1 (601) and transmitted to the talk server 603. Thereafter, the content is transmitted to the terminal 2 (604) at time t12 with a slight time difference.
  • in this way, the above-described content is shared on the terminal 1 (601) and the terminal 2 (604) (for example, when the shared content is moving-image content, substantially real-time playback is performed on both terminals).
  • similarly, a content candidate request message is transmitted from the talk server 603 to the BOT server 602 at time t14, content candidates are returned from the BOT server 602 to the talk server 603 at time t15, and the candidates are transferred to the terminal 2 (604) at time t16.
  • together with the transfer of the content candidates from the talk server 603 to the terminal 2 (604) at time t16, or instead of that transfer, the talk server 603 can also broadcast the content candidates to the members in the talk room so that the content candidate information is shared substantially in real time (this is the event at time t17 in FIG. 6).
  • one of the content candidates is selected at the terminal 2 (604) and transmitted to the talk server 603. Thereafter, this content is transmitted to the terminal 1 (601) at time t19 with a slight time difference.
  • in this way, the above-described content is shared on the terminal 1 (601) and the terminal 2 (604) (for example, when the shared content is moving-image content, substantially real-time playback is performed on both terminals).
  • FIG. 7A shows a state in which a user has entered the talk room on the display screen of the terminal 700 (the casing and the like are not shown) and exchanges messages with other users in the talk group.
  • the display screen of the terminal 700 is roughly divided into a terminal status information display area 701, a messaging service providing area (hereinafter referred to as the "talk room") 702, and an area 703 for inputting and transmitting messages from the terminal 700.
  • in FIG. 7A, a message 7021a "Ginza" transmitted from another user at time 14:00 is displayed on the talk room 702. Further, an assistant BOT 7021b associated with the user is displayed in the vicinity of the message 7021a. In one embodiment of the present invention, the BOT 7021b can be displayed so as to be superimposed on at least a part of the message 7021a (the same applies hereinafter).
  • the BOT 7021b has appeared as a BOT capable of responding to the word "Ginza" (the broken-line portion) in the message 7021a and providing information related to it. If the user using the terminal 700 wants to draw related information from the BOT 7021b, further information can be provided through an operation (not shown) (for example, clicking or tapping the BOT 7021b).
  • on the other hand, if the user using the terminal 700 does not need information from the BOT 7021b, the user can dismiss the BOT 7021b by flicking or swiping it toward the outside (edge direction) of the area 702 with his/her hand or finger 7021c (the BOT 7021b disappears from the screen).
  • when there is a relationship (a direct or indirect friend relationship, etc.) between the user using the terminal 700 and the BOT 7021b, it can also be controlled so that this relationship is cancelled along with the dismissal of the BOT 7021b described above. As an example, the relationship flag between the user using the terminal 700 and the BOT 7021b is reset in a friend relationship table (not shown).
  • the user using the terminal 700 transmits a message 7022a at 14:30.
  • a BOT 7022b (which may be the same as or different from the BOT 7021b) is displayed in the vicinity of the message 7022a in response to the word "Roppongi" shown in broken lines in the message 7022a.
  • the BOT 7022b is waiting for an instruction from the user on the assumption that the related information of the keyword “Roppongi” in the message 7022a can be provided.
  • here, the user using the terminal 700 desires to receive further information regarding "Roppongi" from the BOT 7022b and, as an example, clicks or taps the position 7022c indicating the BOT 7022b with his/her hand or finger.
  • then, option buttons 7023a to 7023c, representing themes related to "Roppongi", are displayed in the vicinity of the BOT 7022b. That is, the button 7023a is a selection button for obtaining further information about "rice", the button 7023b is a selection button for obtaining further information about "sake", and the button 7023c is a selection button for obtaining further information about "night".
  • the user using the terminal 700 presses the option button 7023c with his / her hand or finger 7021d.
  • then, facility or store candidates 7024a to 7024c under the theme "night" are displayed.
  • the user is interested in 7024b and clicks or taps position 7024d with his / her hand or finger.
  • then, the message 7025a is transmitted to the other members in the talk group, and the other members can know the content 7025b by receiving it.
  • each member can obtain detailed information of the facility or the store by tapping the content 7025b.
  • FIG. 7B exemplarily illustrates how things look on another user's own terminal 710 while the terminal 700 described with reference to FIG. 7A is being used.
  • in FIG. 7B, a message 711a (corresponding to 7021a) and a message 712a (corresponding to 7022a) of another user are shown on the left side of the talk room of the terminal 710 (the casing and the like are not shown), and in the vicinity below them, assistant BOTs 712b, 712c, and 712d are displayed in response to the keyword "Roppongi" in these messages.
  • Each BOT has a different personality (BOT 712d can be associated with BOT 7021b and / or 7022b).
  • in one embodiment, the BOTs described above can be displayed only for the message having the newest time among the messages in the talk room.
  • Order when multiple BOTs appear: when a group of BOTs appears in the vicinity of a specific message in response to the same or similar keywords, as with the BOTs 712b, 712c, and 712d, in one embodiment they can be displayed in the order extracted or specified by the talk server (time series) or in order of their degree of association with the keyword.
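The two ordering rules above (server extraction order versus keyword relevance) might be sketched as follows; the tuple shape (bot_id, extracted_at, relevance) is an assumption for illustration:

```python
def order_bots(bots, by="extracted"):
    """Order BOT candidates for display near a message.
    bots: list of (bot_id, extracted_at, relevance) tuples."""
    if by == "extracted":
        return sorted(bots, key=lambda b: b[1])    # time series (extraction order)
    return sorted(bots, key=lambda b: -b[2])       # keyword relevance, highest first
```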
  • in FIG. 7B, the BOT 7022b can be associated with the BOT 712d, the option buttons 7023a to 7023c with the option buttons 7131a to 7131c, and the content candidates 7024a to 7024c with the content candidates 7132a to 7132c, respectively.
  • then, a message 714a including the content 714b, displayed through selection on the other user's terminal (for example, the terminal 700), is displayed at the lower left of the talk room.
  • here, the transmitted message 7025a can be associated with the received message 714a, and the transmitted content 7025b with the received content 714b.
  • FIG. 7C shows another operation example and display example associated with operation on the terminal 720 and the like.
  • in FIG. 7C, a message 7211 and a message 7212a of other users are displayed on the left side of the talk room of the terminal 720 (the casing and the like are not shown), and a plurality of BOTs are displayed in the vicinity of these message groups 721.
  • the BOTs 7212b and 7212c are also involved in the speech of the user using the terminal 720 (message on the right side of the talk room) (described later).
  • a characteristic point here is that the BOT 7212c (and the BOT 722b) is rendered semi-transparently on the talk room of the terminal 720, as shown in the figure. That is, these BOTs are not directly associated with the user using the terminal 720.
  • as an example, the BOT 7212c (and the BOT 722b) is directly associated with another user (for example, a user using the terminal 700) who has a friend relationship with the user using the terminal 720. Even in this case, the user using the terminal 720 can receive information from the BOT 7212c (and the BOT 722b) that is not directly associated with him/her (that is, is indirectly associated via the friend user) (described later).
  • the BOTs 722b and 722c are displayed in response to a part of the message 722a ("DANCE" in the figure), but the BOT 722b is not directly linked to the user himself/herself, and has therefore been rendered semi-transparently (however, as described above, advice can also be obtained from this BOT 722b).
  • a user using the terminal 720 taps the BOT 722c at the position 722d and requests provision of related information regarding the dance.
  • thereafter, the user using the terminal 720 transmits the message 723, and transmits the message 724 after a while (as shown in the drawing, about 30 minutes later, as an example).
  • here, a BOT of the same character as the BOT 722c is displayed in the vicinity of the message 724 (at the lower left corner of the message in the figure), but as shown in the drawing, its appearance is slightly different from that of the BOT 722c displayed about 30 minutes earlier (as an example, it looks busy). In this way, the displayed BOT can be given a facial expression to convey that a little time is required for information collection, or the degree of that time.
  • the user using the terminal 720 transmits the message 725a without waiting for advice information from the BOT 722c.
  • a BOT 725b is displayed at the lower left corner of the message 725a (as an example, this BOT is the same as the BOT 722b and is not directly linked to the user himself/herself), and the user taps the BOT 725b with his/her hand or finger 725c to ask for advice.
  • FIG. 7D shows an operation example and a display example on the terminal 730 (this terminal may be the same as or different from the terminal 720) accompanying the operation on the terminal 720 described with reference to FIG. 7C. That is, the messages 731, 732, 733a, and 735 correspond to the messages 7211, 7212a, 722a, and 724 in FIG. 7C, respectively.
  • the area 734 in the talk room is an overlay display, displayed in response to the tap on the BOT 722c (the DANCE-related information request) in FIG. 7C.
  • in the area 734, buttons 7341a, 7341b, and 7341c related to DANCE are displayed. Further, when the user using the terminal 730 presses the button 7341a (for example, taps the position 7341d), content candidates 7342a, 7342b, and 7342c relating to dance videos are displayed.
  • the user using the terminal 730 selects the content candidate 7342b by tapping the position 7342d, and the content 736 is displayed in a reproducible state in the lower right corner of the talk room.
  • FIG. 7E is an example of the screen on the terminal 720 after the series of operations described with reference to FIGS. 7C and 7D (this screen is described as a display example on the terminal 740).
  • the message 741 displayed in the talk room corresponds to the messages 7211 and 731; the message 742 to the messages 7212a and 732; the message 743 to the messages 722a and 733a; the message 744 to the message 723 (see FIG. 7D); the message 745 to the messages 724 and 735; the message 746 to the message 725a (the corresponding message is hidden under the overlay in FIG. 7D); and the content 747 to the content 736.
  • the message group shown in FIG. 7E also serves as a history of the exchanges so far, and the BOT displayed at the corner of each message remains displayed in the vicinity of the corresponding message. However, the present invention is not limited to this: variations can be adopted, such as automatically deleting a BOT after it has been displayed for a certain period of time, or leaving displayed only those BOTs that have once received a request from the user.
  • <First Modification> FIGS. 8A(A) to 8A(C) illustrate variations in how to display the assistant BOT as a character.
  • the shields 801a, 801b, and 801c in the figure may be physical shields such as corners on the screen, or may be software shields such as messages and display objects on the screen.
  • the BOT 810a shown in FIG. 8A(A) is, as an example, a character object of a chick, and only about half of its face peeks out from the shield 801a. That is, in FIG. 8A(A), control is performed so that only a part of the object image representing the BOT 810a meets the user's eyes.
  • the BOT 810b shown in FIG. 8A(B) (the same character object as the BOT 810a) has almost all of its face out from behind the shield 801b. That is, in FIG. 8A(B), a larger part of the object image representing the BOT 810b (more than in FIG. 8A(A)) is controlled so as to meet the user's eyes.
  • the BOT 810c shown in FIG. 8A(C) (the same character object as the BOTs 810a and 810b) is entirely out from behind the shield 801c. That is, in FIG. 8A(C), the entire object image representing the BOT 810c is controlled so as to meet the user's eyes.
  • the following technical significance can be given, as an example, to how much of the object image representing the BOT is shown to the user (the object display ratio).
  • the quality of information that the BOT can provide and the object display ratio can be associated in a proportional relationship, or the amount of information that can be provided and the object display ratio can be associated in a proportional relationship.
  • the above display mode can be applied in order to express a guideline for the time from the request for information provision to the BOT until the reply.
  • as an example, the information provision time and the object display ratio are inversely related (that is, when the object display ratio is 100%, a quick response can be expected and the BOT presents itself confidently; when the object display ratio is 20%, the reply is expected to take time, and the BOT shows that it is somewhat unconfident). This time can be reflected by calculating the load on the server.
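The inverse relation between expected reply time and object display ratio could be sketched as below. The linear mapping and the 60-second ceiling are illustrative assumptions; an expected reply time of 0 gives a 100% display ratio, and 48 of 60 seconds gives the 20% ratio mentioned above:

```python
def display_ratio(expected_reply_seconds, max_seconds=60):
    """Map the estimated reply time (e.g. derived from server load)
    to an object display ratio in [0.0, 1.0], inversely proportional."""
    ratio = 1.0 - min(expected_reply_seconds, max_seconds) / max_seconds
    return round(ratio, 2)
```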
  • FIGS. 8B(A) to 8B(C) illustrate variations in how to display the assistant BOT as a message box in the same manner as a human user.
  • Shields 802a, 802b, and 802c in the figure may be physical shields such as corners on the screen, or may be software shields such as messages and display objects on the screen.
  • the message box 820c shown in FIG. 8B (C) is controlled so that the entire message box 820c can be seen from the shield 802c.
  • the third modification illustrated with reference to FIG. 8C is still another variation of the first to second modifications.
  • the quality of information that can be provided from the BOT and the size of the message object can be associated with a proportional relationship, or the amount of information that can be provided and the size of the message object can be associated with a proportional relationship.
  • the above display mode can also be applied to express a guideline for the time from requesting information provision to BOT until reply.
  • FIGS. 8D(A) to 8D(C) illustrate variations in how to display the assistant BOT as a message box in the same manner as a human user.
  • the messages 840a, 840b, and 840c in the figure are illustratively different in the darkness of the lines.
  • various technical significances can be given to the line density, background color, etc. of the message object (as described with reference to the above table as an example).
  • the fifth modification illustrated with reference to FIG. 9 mainly shows variations in the case where a plurality of assistant BOTs, specifically described with reference to FIGS. 7B to 7E, are displayed in the vicinity of a specific message or the like; in particular, it shows variations in which the display mode is changed depending on the relationship between the user and the BOT.
  • the assistant BOTs 901a, 902a, and 903a shown in FIG. 9A are displayed side by side in the vicinity of a specific message, like the BOTs 712b, 712c, and 712d.
  • here, it is assumed that the BOTs 901a, 902a, and 903a are all directly associated with the terminal user displaying them. In this case, as shown in the figure, display control is performed so that no distinction is made in the way the assistant BOTs 901a, 902a, and 903a are displayed (display mode).
  • among the BOTs 901b, 902b, and 903b shown in FIG. 9B, the BOTs 901b and 902b are directly associated with the terminal user displaying them, but the BOT 903b is assumed to have only an indirect relationship (for example, the BOT 903b is directly associated with another user in the talk room). In this case, as shown in the figure, the assistant BOTs 901b, 902b, and 903b are displayed in different ways (display modes), and the BOT 903b is controlled to be translucent.
  • among the BOTs 901c, 902c, and 903c shown in FIG. 9C, the BOT 901c is directly associated with the terminal user displaying them, but the BOTs 902c and 903c are assumed to have only indirect relationships. Also in this case, as shown in the figure, the assistant BOTs 901c, 902c, and 903c are displayed in different ways (display modes), and the BOTs 902c and 903c are controlled to be translucent.
  • the assistant BOTs 901d, 902d, and 903d shown in FIG. 9D correspond to the BOTs 901c, 902c, and 903c, respectively, and show a further variation in how to distinguish display modes. That is, the BOTs 902c and 903c are display-controlled to be translucent, whereas the BOTs 902d and 903d are display-controlled to be mosaicked for the same purpose.
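The relationship-dependent display modes of FIGS. 9A to 9D reduce to a small mapping. The style names ("normal", "translucent", "mosaic") mirror the description above, but the function shape and the "hidden" fallback are assumptions:

```python
def display_mode(relationship, indirect_style="translucent"):
    """Choose a display mode for an assistant BOT based on the relationship
    between the BOT and the terminal user displaying it."""
    if relationship == "direct":
        return "normal"                # directly linked: normal object (FIG. 9A)
    if relationship == "indirect":
        return indirect_style          # "translucent" (9B/9C) or "mosaic" (9D)
    return "hidden"                    # no relationship at all (assumption)
```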
  • the sixth modification illustrated with reference to FIG. 10 shows variations in displaying the busyness of the assistant BOT 722c specifically described with reference to FIG. 7C (a longer response time based on the amount of processing, etc.).
  • The BOT shown in FIG. 10A is displayed in a mode indicating that a normal response time can be expected.
  • The BOT shown in FIG. 10B is displayed in a mode indicating that a longer response time is expected than in the case shown in FIG. 10A.
  • The BOT shown in FIG. 10C is displayed in a mode indicating that an even longer response time is expected than in the case shown in FIG. 10B.
  • The BOT shown in FIG. 10D represents a state where a remote operation is being performed, as shown in the figure.
  • That is, this is an example of a display mode for showing the user that the BOT is not responding automatically by a computer program but is being supported manually by a human operator.
  • The fourth embodiment is an embodiment in which billing (charging) processing is introduced into the first to third embodiments.
  • the basic concept will be described with reference to FIGS.
  • a part or all of the contents described in the fourth embodiment can be applied to other embodiments except when the features are mutually exclusive.
  • Typical timings of charging based on the above embodiments are as follows:
  • (B1) When the assistant BOT is presented or provided; (B2) when selectable information, a menu, or the like covering a plurality of rough themes is requested from / provided by the assistant BOT; (B3) when content candidates along one or more of the themes are requested / provided; and (B4) when one content among the content candidates is requested / provided.
  • In the following, the assistant BOT exemplified in (B1) is referred to as the first object or first information, the selectable information or menu exemplified in (B2) and the content candidates exemplified in (B3) are referred to as the second object or second information, and the provided content exemplified in (B4) is referred to as the third object or third information.
  • FIG. 11 shows a basic flow of billing processing in the system or the like according to one embodiment of the present invention.
  • In step S1101 of FIG. 11, it is assumed that an application (typically, a messaging service application) has already been activated.
  • In step S1102, the first information presentation (first object presentation) by the assistant BOT is performed at one of the various timings shown in the above-described embodiments, and the process proceeds to step S1103.
  • In step S1103, a metering count process is performed for charging in response to the first information having been presented by the BOT in the previous step.
  • The point of the processing here is that while the billing process can be performed each time the first information is presented, in some cases a stepwise billing process is performed according to the number of times the first information has been presented. For example, in step S1103, the number of times the first information has been presented by the BOT in the previous step is counted, and the billing process can be performed when a certain period (for example, one week or one month) has elapsed or when a certain number of presentations (for example, 1,000 or 10,000) has been counted. Note that a known technique can be appropriately applied to the billing process itself.
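A minimal sketch of such a stepwise metering count, in which a billing event is raised only once a presentation-count threshold is reached rather than on every presentation, might look as follows (the class name and the threshold value are illustrative assumptions, not part of the disclosure):

```python
class MeteringCounter:
    """Counts information presentations and raises a billing event in
    steps, e.g. once per 1,000 presentations, instead of billing on
    every single presentation (cf. step S1103)."""

    def __init__(self, threshold=1000):
        self.threshold = threshold
        self.count = 0

    def record_presentation(self):
        """Count one presentation; return True when a billing step is due."""
        self.count += 1
        if self.count >= self.threshold:
            self.count = 0  # reset for the next billing step
            return True
        return False

# With a threshold of 3, every third presentation triggers billing.
counter = MeteringCounter(threshold=3)
events = [counter.record_presentation() for _ in range(7)]
# events == [False, False, True, False, False, True, False]
```

A period-based variant (billing once per week or month) would work the same way, keyed on elapsed time instead of the count.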
  • In step S1104, it is determined whether there is a second information presentation (second object presentation) request to the assistant BOT by the user via one of the various scenes shown in the above-described embodiments. If No, the process skips to step S1110; if Yes, the process proceeds to step S1105, and the second information is presented by the BOT on the user terminal.
  • In step S1106, a metering count process is performed for charging in response to the second information having been presented to the user in the previous step.
  • As in step S1103, the billing process can be performed each time the second information is presented, or in some cases a stepwise billing process is performed according to the number of times the second information has been presented.
  • In step S1107, it is determined whether there is a third information presentation (third object presentation) request to the assistant BOT by the user via one of the various scenes shown in the above-described embodiments. If No, the process skips to step S1110; if Yes, the process proceeds to step S1108, and the third information is presented by the BOT on the user terminal.
  • In step S1109, a metering count process is performed for charging in response to the third information having been presented to the user in the previous step.
  • As in steps S1103 and S1106, the billing process can be performed each time the third information is presented, or in some cases a stepwise billing process is performed according to the number of times the third information has been presented.
  • Step S1110 is the end of this flow; however, as long as the application is operating, the processing from step S1102 to step S1109 described above is repeated.
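One iteration of the loop from step S1102 to step S1109 can be sketched as follows. The helper callables (`present`, `requests`, `tally`) are hypothetical stand-ins for the BOT presentation and user-request handling described above, not API names from the disclosure.

```python
from collections import Counter

def billing_flow_iteration(present, requests, tally):
    """One pass of steps S1102 to S1109 of FIG. 11.

    present(kind) shows the kind-th information on the user terminal,
    requests(kind) tells whether the user requested that kind, and
    tally counts presentations per kind for the metering (billing) step.
    """
    present(1)                      # S1102: first information presentation
    tally[1] += 1                   # S1103: metering count
    if requests(2):                 # S1104: second information requested?
        present(2)                  # S1105
        tally[2] += 1               # S1106
        if requests(3):             # S1107: third information requested?
            present(3)              # S1108
            tally[3] += 1           # S1109

# Example: the user asks for the second information but not the third.
tally = Counter()
shown = []
billing_flow_iteration(shown.append, lambda kind: kind == 2, tally)
# shown == [1, 2] and tally[3] == 0
```

Note that the third-information branch is only reachable after the second information has been presented, mirroring the figure's skip-to-S1110 paths.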
  • The seventh modified example, described with reference to FIG. 12, extends the metering count processing of steps S1103, S1106, and S1109 in FIG. 11 by implementing adjustment logic for the charging of the same content or the like by a plurality of members in a talk group.
  • When the metering count process for providing information or the like is started in step S1201 of FIG. 12, the process proceeds to step S1202, where the account of the information presentation destination (user) is identified, and further proceeds to step S1203, where the account group (talk group) to which that account belongs is identified.
  • step S1202 may be omitted.
  • In step S1204, it is determined whether the information to be provided, which triggered this flow, has already been requested or provided within the account group identified in the previous step. If No in step S1204, the process skips to step S1206; if Yes, the process proceeds to step S1205, and the duplication flag is turned on (the duplication flag has been initialized at an appropriate timing, not shown).
  • In step S1206, it is determined whether the duplication flag is on. If Yes, the process skips to step S1208; if No, the process proceeds to step S1207, and the metering count process is performed.
  • A specific example of the metering count process is as described with reference to FIG. 11.
  • Step S1208 is the end of this flow; however, as long as the application is operating, the processing from step S1202 to step S1207 described above is performed as part of the extended metering count processing.
  • The operation flow of FIG. 12 means, as an example, that when the assistant BOT object is displayed in response to a message and the assistant BOT is selected by a first user in the talk group, no billing information for the BOT provider is generated when the same assistant BOT is subsequently selected by a second user in the talk group (control that avoids duplicate billing within the group).
  • As another example, when the assistant BOT is displayed in response to a certain message and is selected by the first user in the talk group, billing to the BOT provider based on a selection by the second user can be performed separately from the billing based on the selection by the first user (control that places importance on charging for each account).
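The two charging policies described above (duplicate avoidance within the group versus per-account charging) can be sketched as a single decision function. The function and parameter names are illustrative assumptions, not part of the disclosure.

```python
def should_bill(selection_log, group_members, user, content_id, per_account):
    """Decide whether a selection generates new billing information.

    per_account=False: duplicate-avoidance control (FIG. 12) - no billing
    if any member of the same talk group already selected this content.
    per_account=True: per-account control - each account's own first
    selection of the content is billed separately.
    """
    if per_account:
        duplicated = (user, content_id) in selection_log
    else:
        duplicated = any((member, content_id) in selection_log
                         for member in group_members)
    if not duplicated:
        selection_log.add((user, content_id))
    return not duplicated

# Duplicate-avoidance mode: only the first selection in the group is billed.
log = set()
group = ["first_user", "second_user"]
a = should_bill(log, group, "first_user", "assistant_bot", per_account=False)   # True
b = should_bill(log, group, "second_user", "assistant_bot", per_account=False)  # False
# Per-account mode: each user's first selection is billed.
log = set()
c = should_bill(log, group, "first_user", "assistant_bot", per_account=True)    # True
d = should_bill(log, group, "second_user", "assistant_bot", per_account=True)   # True
```

In the duplicate-avoidance case, the `duplicated` test plays the role of the duplication flag checked in step S1206.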
  • Next, a case where the billing setting information for the BOT of the BOT providing user is viewed will be described.
  • In this case, a browsing screen is displayed that includes a terminal status information display area 1301, a BOT information display area 1302, a charging fee confirmation / change area 1303 for the first information presentation, a charging fee confirmation / change area 1304 for the second information presentation, a charging fee confirmation / change area 1305 for the third information presentation, and a payment means guidance area 1306.
  • In the BOT information display area 1302, the target BOT name and a BOT character image are displayed so that the BOT can be identified at a glance.
  • In the charging fee confirmation / change area 1303, the number of points per view is displayed as the billing unit price for the first information presentation.
  • Here, a point is a virtual currency value used when accumulating the number of views (number of presentations).
  • When the corresponding setting change button is pressed, the screen transitions to another screen (not shown) as an example, and the charging conditions (by default, the charging conditions at the time of application) can be changed as appropriate.
  • The charging fee confirmation / change area 1304 for the second information presentation and the charging fee confirmation / change area 1305 for the third information presentation have the same configuration as the area 1303.
  • In these areas, setting change buttons 1304b and 1305c are displayed, respectively.
  • When one of these buttons is pressed, the screen transitions to another screen (not shown) as an example, and the billing conditions for the corresponding information presentation (by default, the billing conditions at the time of application) can be changed as appropriate.
  • With such a display, the user can easily recognize that he or she has been charged (or that points for the charging process have been added).
  • Next, a case where the billing information for the BOT of the BOT provider user is viewed will be described.
  • In this case, a browsing screen is displayed. More specifically, a terminal status information display area 1501, a BOT information display area 1502, current billing information (subtotal) 1503, current billing information (tax) 1504, and current billing information (total) 1505 are displayed, together with:
  • a breakdown column for the first information presentation, including a detail display button 1506a;
  • a breakdown column for the second information presentation, including a detail display button 1506b; and
  • a breakdown column for the third information presentation, including a detail display button 1506c.
  • In the BOT information display area 1502, the target BOT name and a BOT character image are displayed so that the BOT can be identified at a glance.
  • In the current billing information (subtotal) 1503, as an example, the current billed amount for the billing month is displayed excluding tax.
  • In the current billing information (tax) 1504, as an example, the tax amount on the current billed amount for the billing month is displayed.
  • In the current billing information (total) 1505, as an example, the current total billed amount for the billing month is displayed including tax.
  • The breakdown column for the first information presentation shows that the first information presentation has occurred 8,089 times so far in the billing month.
  • When the detail display button 1506a is pressed, the screen transitions to another screen (not shown) as an example, and a statistical breakdown can be viewed.
  • The breakdown column for the second information presentation shows that the second information presentation has occurred 1,607 times so far in the billing month.
  • When the detail display button 1506b is pressed, the screen transitions to another screen (not shown) as an example, and a statistical breakdown can be viewed.
  • The breakdown column for the third information presentation shows that the third information presentation has occurred 1,008 times so far in the billing month.
  • When the detail display button 1506c is pressed, the screen transitions to another screen (not shown) as an example, and a statistical breakdown can be viewed.
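For illustration, the subtotal / tax / total display of this screen follows from the per-kind presentation counts and unit prices. The following sketch uses the counts given above (8,089 / 1,607 / 1,008) together with hypothetical per-presentation unit prices and a hypothetical 10% tax rate; none of these price values appear in the disclosure.

```python
def billing_statement(counts, unit_prices, tax_rate=0.10):
    """Compute the monthly billing statement shown to the BOT provider.

    counts and unit_prices map presentation kind (1, 2, 3) to the number
    of presentations and a per-presentation price; the prices and the
    10% tax rate are illustrative assumptions only.
    """
    subtotal = sum(counts[kind] * unit_prices[kind] for kind in counts)
    tax = round(subtotal * tax_rate)
    return {"subtotal": subtotal, "tax": tax, "total": subtotal + tax}

# Counts from the example screen: 8,089 / 1,607 / 1,008 presentations.
statement = billing_statement({1: 8089, 2: 1607, 3: 1008},
                              {1: 1, 2: 5, 3: 10})
# statement["total"] == statement["subtotal"] + statement["tax"]
```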

PCT/JP2018/010218 2017-03-15 2018-03-15 Bot control management program, method, device, and system WO2018168998A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-050419 2017-03-15
JP2017050419A JP6929670B2 (ja) 2017-03-15 2017-03-15 Program, information processing method, and terminal

Publications (1)

Publication Number Publication Date
WO2018168998A1 true WO2018168998A1 (ja) 2018-09-20

Family

ID=63523833

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010218 WO2018168998A1 (ja) 2017-03-15 2018-03-15 Bot control management program, method, device, and system

Country Status (2)

Country Link
JP (3) JP6929670B2
WO (1) WO2018168998A1

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111124319A (zh) * 2018-10-30 2020-05-08 富士施乐株式会社 信息处理装置、存储介质及信息处理方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7388665B2 (ja) * 2022-02-08 2023-11-29 グリー株式会社 Information processing system, information processing method, and information processing program
JP7421762B1 (ja) 2022-09-27 2024-01-25 グリー株式会社 Information processing system, control method, and server device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014142919A (ja) * 2013-01-22 2014-08-07 Nhn Business Platform Corp マルチユーザメッセンジャーサービスを提供する方法およびシステム
JP2017027443A (ja) * 2015-07-24 2017-02-02 Line株式会社 コンテンツ識別子による別のコンテンツを提供するシステム及びその方法
US20170337209A1 (en) * 2016-05-17 2017-11-23 Google Inc. Providing suggestions for interaction with an automated assistant in a multi-user message exchange thread

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003099382A (ja) 2001-09-20 2003-04-04 Sharp Corp コミュニケーションシステム及び情報処理プログラムを記録した記録媒体
JP2004152057A (ja) 2002-10-31 2004-05-27 Hitoshi Kimura ビデオチャット等の課金システム
JPWO2004057507A1 (ja) 2002-12-20 2006-04-27 富士通株式会社 利用者位置による課金システム
KR101266141B1 (ko) 2011-10-11 2013-05-21 (주)카카오 인스턴트 메시징 서비스 제공 방법 및 그 제공 시스템
JP6681146B2 (ja) 2015-03-31 2020-04-15 Line株式会社 情報処理装置、情報処理方法、及びプログラム
US11477139B2 (en) 2016-02-25 2022-10-18 Meta Platforms, Inc. Techniques for messaging bot rich communication



Also Published As

Publication number Publication date
JP7458458B2 (ja) 2024-03-29
JP2021185501A (ja) 2021-12-09
JP2018156183A (ja) 2018-10-04
JP2022190108A (ja) 2022-12-22
JP7171854B2 (ja) 2022-11-15
JP6929670B2 (ja) 2021-09-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18768586

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18768586

Country of ref document: EP

Kind code of ref document: A1