WO2020013060A1 - Two-way video communication system and operator management method - Google Patents
Two-way video communication system and operator management method
- Publication number
- WO2020013060A1 (PCT/JP2019/026526)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- operator
- terminal
- user
- information
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
Definitions
- The present disclosure relates to a two-way video communication system in which a video of a user operating a kiosk terminal and a video of an operator operating an operator terminal are communicated bidirectionally between the kiosk terminal and the operator terminal, and to a method of managing the operators.
- An operator who provides a telephone response service at a call center may be subjected to high stress when handling complaints and inquiries from users (customers and the like). Techniques are therefore being developed that monitor the physical condition of such operators and, based on the monitoring result, instruct an operator to take a break or transfer the user to another operator, thereby reducing the operator's stress.
- For example, a technique is known in which a call is transferred to a back-office reception unit, a time for responding to the user is set in advance, and the operator is instructed to take a break when that time is determined to be equal to or shorter than a predetermined allowable time (see Patent Document 1).
- Also known is a technique in which stress determination information is generated from the physical state of an operator measured while the operator is responding to a user, and the operator terminal to which a new incoming call is to be distributed is determined based on the stress determination information (see Patent Document 2).
- However, in the techniques of Patent Documents 1 and 2, a measuring device for measuring a physical condition (for example, blood pressure, pulse, or sweating) must be provided for each operator, which complicates the system configuration and increases the system cost.
- The present disclosure therefore has as its main object to provide a two-way video communication system and an operator management method capable of grasping the stress state of each operator while suppressing an increase in system cost.
- The two-way video communication system according to the present disclosure is configured such that a kiosk terminal operable by a user, an operator terminal operable by an operator, and a management server are communicably connected to each other, and a video of the user operating the kiosk terminal and a video of the operator operating the operator terminal are communicated bidirectionally between the kiosk terminal and the operator terminal.
- The kiosk terminal has a first camera that photographs the user, a first communication unit that transmits the video of the user captured by the first camera to the operator terminal and receives the video of the operator transmitted from the operator terminal, and a first display unit that displays the video of the operator.
- The operator terminal has a second camera that photographs the operator, a second communication unit that transmits the video of the operator captured by the second camera to the kiosk terminal and receives the video of the user transmitted from the kiosk terminal, and a second display unit that displays the video of the user.
- The management server acquires, for each operator operating the operator terminal, vital information based on at least the video of the operator captured by the second camera.
- The operator management method according to the present disclosure is a method of managing operators in a two-way video communication system in which a kiosk terminal operable by a user, an operator terminal operable by an operator, and a management server are communicably connected to each other, and a video of the user operating the kiosk terminal and a video of the operator operating the operator terminal are communicated bidirectionally between the kiosk terminal and the operator terminal. The operator terminal captures a video of the operator for the two-way video communication with the kiosk terminal, and the management server acquires vital information based on the video of the operator captured by the operator terminal.
- According to the present disclosure, in the two-way video communication system and the operator management method thereof, the stress state of each operator can be grasped while suppressing an increase in system cost.
- FIG. 4: Block diagram showing a schematic configuration of the kiosk terminal 1 and the operator terminal 2
- FIG. 5: Block diagram showing a schematic configuration of the management server 3
- FIGS. 6 and 7: Explanatory diagrams showing screens displayed on the kiosk terminal 1
- FIG. 8: Explanatory diagram showing a screen displayed on the operator terminal 2
- FIGS. 9 and 10: Explanatory diagrams of the vital information generation processing by the operator terminal 2
- FIG. 11: Explanatory diagram showing an example of vital information generated by the operator terminal 2
- FIG. 12: Flowchart showing the flow of the vital information generation processing by the operator terminal 2
- FIGS. 13 to 16: Explanatory diagrams showing first to fourth display examples of a management screen displaying operator management information
- The first invention is a two-way video communication system in which a kiosk terminal operable by a user, an operator terminal operable by an operator, and a management server are communicably connected to each other, and in which a video of the user operating the kiosk terminal and a video of the operator operating the operator terminal are communicated bidirectionally between the kiosk terminal and the operator terminal. The kiosk terminal has a first camera that photographs the user, a first communication unit that transmits the video of the user captured by the first camera to the operator terminal and receives the video of the operator transmitted from the operator terminal, and a first display unit that displays the video of the operator. The operator terminal has a second camera that photographs the operator, a second communication unit that transmits the video of the operator captured by the second camera to the kiosk terminal and receives the video of the user transmitted from the kiosk terminal, and a second display unit that displays the video of the user.
- The management server acquires, for each operator operating the operator terminal, at least vital information based on the video of the operator captured by the second camera.
- According to this, vital information is acquired based on the video of the operator captured by the camera (second camera) used for the two-way video communication with the kiosk terminal, so that the stress state of each operator can be grasped from the vital information while suppressing an increase in system cost.
- The second invention is characterized in that the management server further acquires, for each operator operating the operator terminal, information on the response time to the user.
- the system administrator can grasp the factors affecting the stress state of each operator based on the information on the response time to the user.
- the third invention is characterized in that the management server further acquires, for each operator operating the operator terminal, information on the stress state of the operator estimated based on the vital information.
- the system administrator can easily grasp the stress state of each operator based on the information on the stress state of the operator estimated based on the vital information.
- The fourth invention is characterized in that the information on the stress state includes an evaluation value of the stress state of the operator and a threshold value preset for the evaluation value of the stress state.
- the system administrator can more appropriately grasp the stress state of each operator based on the evaluation value of the stress state of the operator and a preset threshold value.
- the fifth invention is characterized in that the management server further acquires information on the emotion of the user based on the video of the user.
- the system administrator can grasp the factors affecting the stress state of each operator based on the information on the emotion of the user corresponding to the operator.
- The sixth invention is characterized in that the management server further acquires vital information based on the video of the user captured by the first camera of the kiosk terminal.
- According to this, the system administrator can grasp factors affecting the stress state of each operator, based on vital information derived from the video of the user to whom the operator is responding, while suppressing an increase in system cost.
- The seventh invention is characterized in that the management server simultaneously displays information on the stress state for a plurality of operators who have operated the operator terminals.
- the system administrator can quickly grasp the stress state of a plurality of operators.
- The eighth invention is characterized in that the management server simultaneously displays the information on the stress state of the operator estimated based on the vital information and the information on the emotion of the user based on the video of the user.
- the system administrator can more properly grasp the factors affecting the stress state of each operator based on the information on the stress state of the operator and the information on the emotion of the user.
- The ninth invention is characterized in that the second camera includes a front camera for photographing the face of the operator and a hand camera for photographing the hands of the operator, and the management server acquires the vital information based on the video of the operator captured by the front camera.
- the tenth invention is characterized in that the vital information includes at least one of a pulse wave and a heartbeat.
- the system administrator can easily grasp the stress state of each operator based on at least one of the pulse wave and the heartbeat.
- The eleventh invention is a method of managing operators in a two-way video communication system in which a kiosk terminal operable by a user, an operator terminal operable by an operator, and a management server are communicably connected to each other, and in which a video of the user operating the kiosk terminal and a video of the operator operating the operator terminal are communicated bidirectionally between the kiosk terminal and the operator terminal, wherein the operator terminal captures a video of the operator for the two-way video communication with the kiosk terminal, and the management server acquires vital information based on the video of the operator captured by the operator terminal.
- According to this, vital information is acquired based on the video of the operator used for the two-way video communication with the kiosk terminal, so that the stress state of each operator can be grasped from the vital information while suppressing an increase in system cost.
- FIG. 1 is an overall configuration diagram of the two-way video communication system according to the present embodiment.
- This two-way video communication system includes a kiosk terminal 1, an operator terminal 2, and a management server 3.
- the kiosk terminal 1, the operator terminal 2, and the management server 3 are mutually connected via a network such as the Internet, a VPN (Virtual Private Network), or an intranet.
- the kiosk terminal 1 is installed in various facilities and operated by an arbitrary user.
- The kiosk terminal 1 transmits a video of the user (hereinafter simply referred to as the "user video") to the operator terminal 2, and displays the video of the operator received from the operator terminal 2.
- the operator terminal 2 is installed in a facility where an operator responding to the user resides, such as a call center, and is operated by the operator.
- The operator terminal 2 transmits a video of the operator (hereinafter simply referred to as the "operator video") to the kiosk terminal 1, and displays the video of the user received from the kiosk terminal 1.
- the management server 3 is installed in a facility used by a system administrator, such as a management headquarters of an interactive video communication system, and is composed of a management computer operated by the administrator.
- The management server 3 can acquire operator management information for managing the stress state of the operators from the kiosk terminal 1 and the operator terminal 2, respectively, and can also generate the operator management information itself.
- For example, the management server 3 can acquire, as the operator management information, the vital information of each operator generated based on the video of the operator captured by each operator terminal 2. Further, the management server 3 can acquire, as the operator management information, information on the emotion of the user generated based on the video of the user captured by each kiosk terminal 1 (the video of the user to whom the operator being managed is responding).
- The kiosk terminal 1 can provide various services. For example, by installing the kiosk terminal 1 in the lobby of a transportation facility such as an airport, services such as guidance on nearby sightseeing spots, floor guidance within the facility, and guidance on nearby accommodation can be provided. By installing the kiosk terminal 1 in a branch of a financial institution such as a bank, various services normally performed at a counter, for example account opening and consultation on financial transactions and loan contracts, can be provided. By installing the kiosk terminal 1 at the front desk of an accommodation facility such as a hotel, the various guidance services provided by a clerk (concierge) can be offered. By installing the kiosk terminal 1 in the entrance lobby of a condominium or apartment building, various services provided by a building manager can be offered.
- the management server 3 can acquire various types of information communicated between the kiosk terminal 1 and the operator terminal 2 as necessary.
- In these communications, confidential information, for example personal information such as the user's name and address, or information such as a financial-institution account number, may be exchanged.
- When the service provider already operates a highly secure dedicated network, information other than video, including the confidential information, may be communicated over that existing network, and the video may be communicated over another network.
- FIG. 2 is a perspective view illustrating an appearance of the kiosk terminal 1.
- the kiosk terminal 1 includes a housing 11, a front monitor 12, a hand monitor 13, a front camera 14, a hand camera 15, an IC card reader 16, a speaker 17, and a microphone 18.
- the front monitor 12 is arranged with the screen facing forward, and the hand monitor 13 is arranged with the screen facing upward.
- the hand monitor 13 has a touch panel, and a user can perform a screen operation.
- the front camera 14 photographs the upper body including the user's face from the front.
- the hand camera 15 photographs the hand of the user, that is, the hand of the user placed on the hand monitor 13 and the screen of the hand monitor 13 from above. The user performs an operation of pointing the screen of the hand monitor 13 with his / her hand, and this situation is photographed by the hand camera 15.
- the IC card reader 16 reads an IC card possessed by the user.
- the speaker 17 outputs the voice uttered by the operator.
- the microphone 18 picks up a voice (user voice) emitted by the user.
- the kiosk terminal 1 configured as described above is placed on a table such as a counter, and the user operates the kiosk terminal 1 while sitting on a chair or while standing.
- FIG. 3 is a perspective view illustrating an appearance of the operator terminal 2.
- the operator terminal 2 includes a gantry 21, a first monitor 22, a second monitor 23, a front camera 24, a hand camera 25, a headset 26, and a table 27.
- the first monitor 22 is supported by the gantry 21 so as to have a predetermined height.
- the second monitor 23 includes a touch panel, and allows an operator to perform screen operations.
- the front camera 24 photographs the upper body including the operator's face from the front.
- the hand camera 25 captures the operator's hand, that is, the operator's hand placed on the table 27 and the table 27 from above.
- The operator places a document such as a pamphlet on the table, and explains it while pointing at the document with his/her hand; this situation is captured by the hand camera 25.
- the headset 26 includes a speaker 28 and a microphone 29.
- the speaker 28 outputs a voice uttered by the user.
- the microphone 29 picks up the sound emitted by the operator.
- the operator terminal 2 is provided with a monitor 5.
- the monitor 5 displays a screen of an application started by the operator terminal 2 or a PC (not shown).
- the screen of this application is shared with the kiosk terminal 1, and the same screen is displayed on the monitor 13 of the kiosk terminal 1 (screen sharing function).
- the monitor 5 has a touch panel, and allows the operator to draw on the screen by hand (whiteboard function).
- the operator can use the operator terminal 2 to perform a telephone service that responds to the user only by voice, in addition to a face-to-face service that responds to the user with video and audio.
- a monitor (not shown) for telephone service is provided in the operator terminal 2.
- FIG. 4 is a block diagram illustrating a schematic configuration of the kiosk terminal 1 and the operator terminal 2.
- the kiosk terminal 1 includes the front monitor 12, the hand monitor 13, the front camera 14, the hand camera 15, the IC card reader 16, the speaker 17, and the microphone 18 as described above.
- the kiosk terminal 1 includes a control unit 31, a communication unit 32, and a storage unit 33.
- the communication unit 32 communicates with the operator terminal 2 and the management server 3 via a network.
- the storage unit 33 stores a program executed by a processor constituting the control unit 31.
- the storage unit 33 stores necessary information such as information on the user's emotion generated by the emotion information generation unit 36 based on the video of the user.
- the control unit 31 includes a screen control unit 35, an emotion information generation unit 36, a voice control unit 37, and a voice conversion unit 38.
- the control unit 31 is configured by a processor, and each unit of the control unit 31 is realized by executing a program stored in the storage unit 33 by the processor.
- the screen control unit 35 controls the display screens of the front monitor 12 and the hand monitor 13. In the present embodiment, when receiving the front image of the operator from the operator terminal 2, the screen control unit 35 displays the front image of the operator on the front monitor 12. In addition, when receiving the operator's hand image from the operator terminal 2, the screen controller 35 displays the operator's hand image on the hand monitor 13.
- When the screen control unit 35 receives caption character information from the operator terminal 2, it can generate a caption image and display it superimposed on the front video of the operator. When the screen control unit 35 receives guidance information from the operator terminal 2, it can generate an image of band information in which the guidance information is visualized, and display the band information image superimposed on the front video of the operator.
- the emotion information generation unit 36 generates information on the emotion of the user.
- The emotion information generation unit 36 calculates an evaluation value of the user's emotion as information relating to the user's emotion by performing face recognition processing on the user video using a known method. For example, when calculating the evaluation value of the user's emotion, the emotion information generation unit 36 extracts face feature information of the user, that is, a plurality of feature points set on the face, from the front video of the user captured by the front camera 14, and calculates the evaluation value of the user's emotion (in the present embodiment, the sum of the estimated values for the four emotions of anger, sadness, surprise, and joy) based on the extracted feature points. The emotion information generation unit 36 can also generate information on the user's emotion based on the user's voice (pitch, frequency, and the like) picked up by the microphone 18, using a known method.
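- The evaluation value described above can be pictured with a minimal sketch. The per-emotion scores, the weights, and all names below are illustrative assumptions; the disclosure only states that the evaluation value is derived from facial feature points (or from the voice) and that each of the four emotions is weighted.

```python
from dataclasses import dataclass

# Hypothetical per-emotion scores; in the disclosure these would be estimated
# from the feature points extracted from the user's front video (or from the voice).
@dataclass
class EmotionScores:
    anger: float
    sadness: float
    surprise: float
    joy: float

# Illustrative weights (not from the disclosure); anger is weighted more heavily
# because it tends to increase the operator's stress.
WEIGHTS = {"anger": 1.5, "sadness": 1.2, "surprise": 1.0, "joy": 1.0}

def emotion_evaluation_value(s: EmotionScores) -> float:
    """Weighted sum of the estimated levels of the four emotions."""
    return (WEIGHTS["anger"] * s.anger + WEIGHTS["sadness"] * s.sadness
            + WEIGHTS["surprise"] * s.surprise + WEIGHTS["joy"] * s.joy)

print(emotion_evaluation_value(EmotionScores(anger=0.6, sadness=0.2, surprise=0.1, joy=0.05)))
```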
- the information on the user's emotion generated by the emotion information generation unit 36 is transmitted to the management server 3 and stored in the management server 3 as operator management information together with vital information from the operator terminal 2 and the like.
- the audio control unit 37 controls the audio output from the speaker 17.
- Depending on whether voice conversion is enabled, the voice control unit 37 outputs from the speaker 17 either the original voice of the operator received from the operator terminal 2 or the voice converted by the voice conversion unit 38.
- the voice conversion unit 38 can convert the original voice of the operator received from the operator terminal 2 into voice of a desired voice quality.
- a well-known voice conversion technique such as voice conversion using deep learning may be used.
- The control unit 31 also performs connection control for connecting to the operator terminal 2, and video transmission control for transmitting and receiving, in real time, the video of the user captured by the kiosk terminal 1 and the video of the operator captured by the operator terminal 2.
- the operator terminal 2 includes the first monitor 22, the second monitor 23, the front camera 24, the hand camera 25, and the headset 26. Further, the operator terminal 2 includes a control unit 41, a communication unit 42, and a storage unit 43.
- the communication unit 42 communicates with the kiosk terminal 1 and the management server 3 via a network.
- the storage unit 43 stores a program executed by a processor included in the control unit 41. Further, the storage unit 43 stores necessary information such as vital information of the operator generated based on an image of the operator.
- the control unit 41 includes a screen control unit 45, a vital information generation unit 46, a voice recognition unit 47, and a corresponding time calculation unit 48.
- the control unit 41 is configured by a processor, and each unit of the control unit 41 is realized by executing a program stored in the storage unit 43 by the processor.
- the screen control unit 45 controls display screens of the front monitor 12 and the hand monitor 13 of the kiosk terminal 1.
- As screen control relating to the hand monitor 13 of the kiosk terminal 1, the screen control unit 45 switches among an operator display mode for displaying the operator's hand video, an operation screen mode for displaying an operation screen (such as a menu screen), and a screen sharing mode for displaying an application screen.
- In the present embodiment, the display modes of the front monitor 12 and the hand monitor 13 of the kiosk terminal 1 are switched according to the content of the user's operation at the kiosk terminal 1, but the display mode may instead be selected by the operator.
- the vital information generator 46 generates vital information of each operator. As will be described in detail later, the vital information generating unit 46 executes a process (vital information generating process) of extracting a pulse wave and a heartbeat as vital information of each operator from the front image of the operator captured by the front camera 24. Thus, the vital information generation unit 46 can appropriately acquire vital information from a relatively stable (less moving) image of the operator captured by the front camera 24.
- the vital information generating unit 46 can also execute the vital information generating process by using an image of the operator's hand photographed by the hand camera 25 instead of the front image of the operator, if necessary. In some cases, the vital information generating unit 46 may execute the vital information generating process using both the front image of the operator and the image at hand of the operator.
- the voice recognition unit 47 performs voice recognition on the voice of the operator collected by the microphone 29 and outputs character information.
- the response time calculation unit 48 calculates the response time of the operator to the user (the time corresponding to each user).
- The response time can be calculated, for example, based on a value of the response time for each user entered by each operator. Alternatively, the response time for each user may be calculated (estimated) from the shooting times of the video of each user.
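- The second option can be sketched roughly as follows, assuming each user session is available as a list of frame timestamps; the data shape and names are assumptions made for illustration only.

```python
from datetime import datetime
from typing import Dict, List

# Hypothetical input: for each user ID, the timestamps of the frames captured
# while the operator was responding to that user.
def estimate_response_times(frames: Dict[str, List[datetime]]) -> Dict[str, float]:
    """Estimated response time in seconds per user, from the video shooting times."""
    times: Dict[str, float] = {}
    for user_id, stamps in frames.items():
        if stamps:
            times[user_id] = (max(stamps) - min(stamps)).total_seconds()
    return times

sessions = {
    "user-001": [datetime(2019, 7, 3, 10, 0, 0), datetime(2019, 7, 3, 10, 6, 30)],
    "user-002": [datetime(2019, 7, 3, 11, 15, 0), datetime(2019, 7, 3, 11, 18, 45)],
}
print(estimate_response_times(sessions))  # {'user-001': 390.0, 'user-002': 225.0}
```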
- The information on the response time to the user calculated by the response time calculation unit 48 is transmitted to the management server 3. Note that the response time calculation unit 48 may be omitted and the management server 3 may be provided with an equivalent function.
- The control unit 41 also performs connection control for connecting to the kiosk terminal 1, and video transmission control for transmitting and receiving, in real time, the video of the user captured by the kiosk terminal 1 and the video of the operator captured by the operator terminal 2.
- the operator terminal 2 may be provided with a scanner for reading documents on hand. Further, the operator terminal 2 may be provided with an IC card reader in order to authenticate that the operating person is a legitimate operator. In addition, the kiosk terminal 1 may be provided with a printer that prints out documents transmitted from the operator terminal 2 and information displayed on the screen.
- the second monitor 23 may be configured by a tablet PC, that is, the control unit 41, the communication unit 42, and the storage unit 43 may be housed in the housing of the second monitor 23.
- FIG. 5 is a block diagram illustrating a schematic configuration of the management server 3.
- the management server 3 includes a control unit 81, a communication unit 82, a storage unit 83, and a monitor 84.
- the control unit 81 is configured by a processor, and various processes including display control of the management screen in the management server 3 are realized by executing a program stored in the storage unit 83 by the processor.
- the communication unit 82 communicates with the kiosk terminal 1 and the operator terminal 2 via a network.
- the storage unit 83 stores a program to be executed by a processor constituting the control unit 81.
- The storage unit 83 also stores, as operator management information, the vital information of the operators received from the operator terminal 2 and the information on the users' emotions received from the kiosk terminal 1.
- the information obtained from the kiosk terminal 1 and the information obtained from the operator terminal 2 are associated with each other based on the time information added thereto and information such as a user ID and an operator ID.
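- This association can be pictured with a small sketch that pairs records by operator ID, user ID, and closeness in time. The record fields and the one-second tolerance are assumptions made for illustration; the disclosure only states that time information and the user and operator IDs are used.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class OperatorRecord:          # hypothetical shape of data from the operator terminal 2
    operator_id: str
    user_id: str
    time: float                # shooting time in seconds
    rri: float                 # vital information

@dataclass
class KioskRecord:             # hypothetical shape of data from the kiosk terminal 1
    operator_id: str
    user_id: str
    time: float
    emotion_value: float

def associate(op_recs: List[OperatorRecord], kiosk_recs: List[KioskRecord],
              max_skew: float = 1.0) -> List[Tuple[OperatorRecord, KioskRecord]]:
    """Pair records that share operator and user IDs and lie within max_skew seconds."""
    pairs = []
    for o in op_recs:
        candidates = [k for k in kiosk_recs
                      if k.operator_id == o.operator_id
                      and k.user_id == o.user_id
                      and abs(k.time - o.time) <= max_skew]
        if candidates:
            pairs.append((o, min(candidates, key=lambda k: abs(k.time - o.time))))
    return pairs
```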
- the monitor 84 displays necessary information such as operator management information for the system administrator or the like to manage the stress state of the operator, as described later in detail.
- FIG. 6 and 7 are explanatory diagrams showing screens displayed on the kiosk terminal 1.
- During standby, the front monitor 12 operates as digital signage and, as shown in FIG. 6(A-1), displays video content relating to advertisements and guidance on the facility.
- a main menu screen (operation screen) is displayed on the hand monitor 13 as shown in FIG. 6 (A-2).
- operation buttons 51 for selecting a service menu are displayed.
- the user can select “procedure” and “consultation” as the service menu.
- When "Consultation" is selected, the mode is changed to the operator display mode, and the display transitions to an operator screen (see FIGS. 7(A-1) and 7(A-2)).
- "Consultation” is a case in which the user consults on a loan or trust contract, etc., and detailed guidance is required and it takes time. The operator responds to the user on the operator screen.
- the “procedure” refers to a case where the user performs an account opening procedure or the like, and requires only a simple screen operation, and usually does not require the operator to provide face-to-face guidance.
- a call button 52 is displayed on the main menu screen of the monitor 13 at hand.
- When the call button 52 is operated, the kiosk terminal 1 is connected to the operator terminal 2, enters the operator display mode, and the display transitions to the operator screen (see FIGS. 7(A-1) and 7(A-2)).
- the user can be guided by the operator even in the case of “procedure” requiring only a simple operation.
- A screen asking the user whether or not to interact with the operator may be displayed before transitioning to the operator screen, and the display may be switched to the operator screen when the user performs a confirming operation.
- a transition to a submenu screen may be made as necessary, as shown in FIG. 6 (B-2).
- operation buttons 53 corresponding to various submenu items are displayed.
- a call button 52 is displayed on this sub-menu screen, similarly to the main menu screen (see FIG. 6A-2).
- On the operator screen, as shown in FIG. 7(A-1), the front monitor 12 displays the front video 61 of the operator captured by the front camera 24 of the operator terminal 2, and at the same time, as shown in FIG. 7(A-2), the hand monitor 13 displays the hand video 62 of the operator captured by the hand camera 25 of the operator terminal 2.
- a screen of an application started on the operator terminal 2 or a PC (not shown) on the operator side is displayed on the monitor 13 at hand.
- the screen of this application is shared with the operator terminal 2, and the same screen is displayed on the monitor 5 of the operator terminal 2 (screen sharing function).
- the user can draw on the screen by handwriting (whiteboard function).
- FIG. 8 is an explanatory diagram showing a screen displayed on the operator terminal 2.
- A standby screen is displayed on the first monitor 22 during standby, and when the user operates the call button 52 (see FIGS. 6(A-2) and 6(B-2)) on the kiosk terminal 1, an incoming call screen is displayed as shown in FIG. 8(A-1).
- On this incoming call screen, information about the kiosk terminal 1 to be connected (such as its installation location and terminal name) is displayed.
- an operation screen is displayed on the second monitor 23 as shown in FIG. 8 (A-2).
- various operation buttons 71 for controlling the operator terminal 2 and giving instructions to the kiosk terminal 1 are displayed.
- an operator's front image 61 shot by the front camera 24 of the operator terminal 2 and an operator's hand image 62 shot by the hand camera 25 of the operator terminal 2 are displayed.
- the front image 61 and the hand image 62 of the operator are the same as those displayed on the kiosk terminal 1.
- the operator's hand image 62 can be switched between an original state and an inverted state.
- While the operator is responding to the user, a front video 72 of the user captured by the front camera 14 of the kiosk terminal 1 is displayed on the first monitor 22, as shown in FIG. 8(B-1).
- the first monitor 22 is supported by the gantry 21 so as to have a predetermined height (see FIG. 3), so that the operator and the user can match their eyes.
- the operation button 71 is displayed on the second monitor 23 as in the standby state.
- the front image 61 of the operator is displayed on the second monitor 23 as in the case of standby.
- the operator's front image 61 can be switched to the operator's hand image.
- the second monitor 23 displays a user's hand image 73 captured by the hand camera 15 of the kiosk terminal 1 in a state where the operator's hand image is displayed. Note that the user's hand image 73 can be switched between an original state and an inverted state.
- The user's hand video 73 displayed on the second monitor 23 shows the user's finger pointing at a document such as a pamphlet shown on the screen of the hand monitor 13 of the kiosk terminal 1, so that the user and the operator can interact while pointing at the document.
- In this example, the user's front video 72 is displayed on the first monitor 22 and the user's hand video 73 is displayed on the second monitor 23, but both may be displayed on a single monitor. In that case, it is possible to create a sense of presence as if the operator were facing the user across a counter.
- FIG. 9 and FIG. 10 are explanatory diagrams relating to vital information generation processing by the operator terminal 2.
- The vital information generation unit 46 executes face detection processing on the video of the operator based on a known statistical learning method using facial feature amounts, treats the detected face region as the operator's skin region, tracks it, and acquires information on the skin region (such as the pixel values and the number of pixels constituting the skin region). For detecting the skin region, face detection processing based on a known pattern recognition method (for example, matching against a prepared template) may also be used instead of the statistical learning method using facial feature amounts.
- Next, the vital information generation unit 46 calculates the operator's pulse based on the acquired information on the skin region. More specifically, for the pixels constituting the skin region extracted from temporally consecutive frame images of the operator video, a representative value of the RGB pixel values (0 to 255 gradations), here the average value over the pixels, is computed, and its time-series data is generated as a pulse signal. In this case, the time-series data can be generated from the pixel values of only the green component (G), which fluctuates particularly strongly with pulsation.
- As shown in FIG. 9(A), the generated time-series data of the pixel values contains only minute fluctuations (for example, fluctuations of less than one gradation) caused by changes in the hemoglobin concentration in the blood. The vital information generation unit 46 therefore applies known filter processing (for example, processing using a band-pass filter with a predetermined pass band) to the time-series pixel-value data, and can thereby extract, as shown in FIG. 9(B), a pulse wave from which noise components have been removed as the pulse signal.
- The vital information generation unit 46 can then calculate the pulse wave interval (RRI) from the time between adjacent peaks of the pulse wave, for example as shown in FIG. 10(A). Further, as shown in FIG. 10(B), the RRI can be obtained as a change of the RRI over time.
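- The pulse-wave extraction and the RRI calculation described above can be sketched roughly as follows. The skin mask is assumed to be given, and the 0.75-3 Hz pass band (about 45-180 beats per minute) and the minimum peak spacing are typical choices assumed for illustration; the disclosure itself only speaks of a band-pass filter with a predetermined pass band.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def green_channel_signal(frames: np.ndarray, skin_mask: np.ndarray) -> np.ndarray:
    """Mean green-channel value of the skin-region pixels in each frame.
    frames: (T, H, W, 3) RGB video, skin_mask: (H, W) boolean skin region."""
    return frames[:, skin_mask, 1].mean(axis=1)

def bandpass(raw: np.ndarray, fs: float, low: float = 0.75, high: float = 3.0) -> np.ndarray:
    """Band-pass filter the raw pixel-value series to keep only the pulse wave."""
    b, a = butter(3, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, raw)

def rri_from_pulse(pulse: np.ndarray, fs: float) -> np.ndarray:
    """RRI series (seconds) from the intervals between adjacent pulse-wave peaks."""
    peaks, _ = find_peaks(pulse, distance=int(0.33 * fs))  # peaks at least ~0.33 s apart
    return np.diff(peaks) / fs
```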
- Since the shooting time (or time of day) of the operator video is associated with each frame image, the vital information extracted from the frame images is also associated with the shooting time.
- the vital information generation unit 46 extracts a physiological or neurological activity index of the operator from the acquired vital information (RRI).
- Examples of the activity index include the RRI, SDNN (the standard deviation of the RRI), RMSSD and pNN50 (indices of heart rate and vagal tone), and LF/HF (an index of stress).
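- The indices named above can be computed from an RRI series with their standard formulas, roughly as below. The LF and HF bands (0.04-0.15 Hz and 0.15-0.4 Hz) and the 4 Hz resampling rate are conventional values assumed for the sketch, not values taken from the disclosure.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_indices(rri: np.ndarray) -> dict:
    """SDNN, RMSSD and pNN50 from an RRI series given in seconds."""
    diffs = np.diff(rri)
    return {
        "sdnn": float(np.std(rri, ddof=1)),             # overall variability
        "rmssd": float(np.sqrt(np.mean(diffs ** 2))),   # short-term (vagal) variability
        "pnn50": float(np.mean(np.abs(diffs) > 0.05)),  # share of successive differences > 50 ms
    }

def lf_hf_ratio(rri: np.ndarray, fs_resample: float = 4.0) -> float:
    """LF/HF from the power spectrum of the RRI series resampled on a uniform grid."""
    t = np.cumsum(rri)
    grid = np.arange(t[0], t[-1], 1.0 / fs_resample)
    uniform = interp1d(t, rri, kind="cubic")(grid)
    f, pxx = welch(uniform - uniform.mean(), fs=fs_resample,
                   nperseg=min(256, len(uniform)))
    lf = np.trapz(pxx[(f >= 0.04) & (f < 0.15)], f[(f >= 0.04) & (f < 0.15)])
    hf = np.trapz(pxx[(f >= 0.15) & (f < 0.40)], f[(f >= 0.15) & (f < 0.40)])
    return float(lf / hf) if hf > 0 else float("nan")
```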
- The vital information generation unit 46 can estimate the physical state of the operator (stress state (tension), degree of concentration, drowsiness, and the like) based on these activity indices. For example, the change in RRI over time is known to reflect sympathetic and parasympathetic nervous activity. Therefore, the physical state can be estimated based on the change in RRI over time shown in FIG. 10(B), that is, the fluctuation of the RRI.
- the vital information generation unit 46 calculates an evaluation value of the stress state of the operator based on the acquired activity index of the operator.
- the vital information generation unit 46 can use LF / HF, which is an index of stress, as the stress state evaluation value.
- Alternatively, the vital information generation unit 46 may calculate the evaluation value of the stress state from at least one of the above activity indices. In this case, the vital information generation unit 46 prepares in advance determination information, obtained by a learning method or an experimental method, indicating the relationship between the temporal change of the activity index and the stress state, and can calculate the evaluation value of the stress state by referring to this determination information.
- FIG. 11 is an explanatory diagram showing an example of vital information generated by the operator terminal 2 (vital information generating unit 46).
- the vital information 91 includes an operator ID number 92, a shooting time 93 of a frame image (elapsed time from a predetermined time in this case), and an RRI value 94 at each shooting time 93.
- the operator ID number 92 (in this example, ID: OP1) is assigned to identify the operator.
- The shooting time 93 is the elapsed time from the start of capturing the video of the operator. In the example of FIG. 11, when the shooting time 93 is "0.782", "1.560", "2.334", and so on, the RRI 94 is "0.782", "0.778", "0.774", and so on.
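- For illustration, the rows of FIG. 11 could be held in a record such as the following; the field names are assumptions, while the example values are the ones quoted above.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class VitalRecord:
    operator_id: str      # e.g. "OP1"
    shooting_time: float  # elapsed time from the predetermined reference, in seconds
    rri: float            # pulse wave interval at that shooting time, in seconds

vital_info: List[VitalRecord] = [
    VitalRecord("OP1", 0.782, 0.782),
    VitalRecord("OP1", 1.560, 0.778),
    VitalRecord("OP1", 2.334, 0.774),
]
```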
- FIG. 12 is a flowchart showing the flow of vital information generation processing by the operator terminal 2.
- First, the operator terminal 2 loads the temporally continuous video (frame images) of the operator captured by the front camera 24 and information on the shooting time into memory (ST101). Next, the operator terminal 2 detects the operator's skin region from the acquired operator video (frame images) (ST102), and extracts the operator's vital information based on the time-series data of the skin region (ST103). The extracted vital information is stored in the storage unit 43.
- Next, the operator terminal 2 acquires response information (user ID, conversation content, and the like) relating to the user to whom the operator is responding (ST104), and associates this response information with the vital information (ST105).
- the operator terminal 2 extracts a physiological or neurological activity index of the operator from the vital information extracted in step ST103 (ST106). Subsequently, the operator terminal 2 determines a stress state of the operator (calculates an evaluation value of the stress state) based on the extracted activity index (ST107). Information on the determined stress state is stored in storage section 43 (ST108).
- ⁇ Information on the stress state stored in the storage unit 43 is transmitted from the communication unit 42 to the management server 3 (ST109).
- In step ST109, in order to display the vital sensing screen 131 shown in FIG. 16, the vital information underlying the stress state is transmitted to the management server 3 together with the information on the stress state.
- Alternatively, the operator terminal 2 may transmit the vital information extracted in step ST103 to the management server 3, and the management server 3 may execute processing corresponding to steps ST106 to ST108 in place of the operator terminal 2 to calculate the stress state.
- the correspondence information on the user acquired in step ST104 can be associated with the vital information transmitted to the management server 3.
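- Putting the flowchart steps together, a rough end-to-end sketch might look as follows. It reuses the helper sketches shown earlier (green_channel_signal, bandpass, rri_from_pulse, hrv_indices, lf_hf_ratio); the stress threshold, the use of LF/HF as the evaluation value, and the two placeholder functions are assumptions for illustration only.

```python
def store_locally(record: dict) -> None:
    pass  # stand-in for saving to the storage unit 43

def send_to_management_server(record: dict) -> None:
    pass  # stand-in for transmission via the communication unit 42

def vital_information_generation(frames, skin_mask, fs, operator_id,
                                 response_info: dict, stress_threshold: float = 2.0) -> dict:
    """Rough sketch of steps ST101-ST109 for one batch of operator video."""
    # ST101-ST103: pulse signal and RRI from the skin region of consecutive frames
    pulse = bandpass(green_channel_signal(frames, skin_mask), fs)
    rri = rri_from_pulse(pulse, fs)

    # ST104-ST105: attach the response information (user ID, conversation content, ...)
    record = {"operator_id": operator_id, "rri": rri.tolist(), **response_info}

    # ST106-ST107: activity indices and a stress evaluation value
    record["indices"] = hrv_indices(rri)
    record["stress_value"] = lf_hf_ratio(rri)
    record["stressed"] = record["stress_value"] > stress_threshold

    # ST108-ST109: store locally, then transmit to the management server 3
    store_locally(record)
    send_to_management_server(record)
    return record
```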
- A configuration in which the management server 3 has the same function as the vital information generation unit 46 is also possible. In that case, the video of the operator is acquired from the operator terminal 2, and the management server 3 performs processing equivalent to ST102 to ST108.
- In this way, the management server 3 obtains or calculates information on the stress state (vital information) based on the video of the operator captured by the camera of the operator terminal 2 used for the two-way video communication with the kiosk terminal 1, so that the stress state of each operator can be grasped from the vital information while suppressing an increase in system cost.
- FIGS. 13 to 17 are explanatory diagrams showing first to fifth display examples of management screens each including operator management information.
- The management server 3 can display, on the monitor 84, a management screen for checking the stress state of each operator, based on at least the operator management information acquired from the operator terminal 2, in response to a request from the administrator. The management server 3 can also transmit display data for displaying the same management screen to another PC (a management PC) connected via the network, in response to a request from the administrator.
- On the main screen 101 shown in FIG. 13, a selection area 102 is provided for the administrator to select an operator (here, a single operator) whose stress state is to be checked.
- FIG. 13 shows a case where the operator OP1 is selected, and the main screen 101 shows a stress state 103 of the operator OP1, a conversation content 104 about the user, and a user's response record 105.
- When the operator OP2 is checked and the operator OP1 is unchecked, the main screen 101 shows the stress state 103 of the operator OP2, the conversation content 104 with users, and the user response record 105.
- the stress state 103 indicates how the evaluation value (vertical axis) of the stress state changes with the lapse of time (horizontal axis) of one day's work, and the evaluation value of the stress state fluctuates in real time.
- The conversation content 104 indicates, as a pie chart, the proportions of the different types of conversation between the operator and users.
- As the conversation types, for example, "normal conversation", "positive conversation", and "negative conversation", which have different effects on the operator's stress state, can be set.
- The type data of the conversation content can be obtained by launching, on the operator terminal, a screen (not shown) for selecting one of a plurality of type candidates at the timing when the operator finishes responding to a user and prompting the operator to make a selection, and can be linked with the response information on that user.
- The user response record 105 indicates the number of users (vertical axis) handled by the operator in each month (horizontal axis).
- The administrator can display a first statistical information screen 111 as shown in FIG. 14 by selecting one or more operators on the main screen 101 and pressing the statistical information 1 button 110 for displaying the first statistical information.
- the first statistical information screen 111 shown in FIG. 14 shows a case where four operators OP1 to OP4 are selected on the main screen 101 and the statistical information 1 button 110 is pressed.
- In FIG. 14(A), for the operator OP1, the following are shown: the response time (vertical axis) for each user handled (here, customer AD and others: horizontal axis); the evaluation value of each user's emotion (here, the average emotion evaluation value: vertical axis); and the evaluation value of the operator's stress state when responding to each user (here, the average stress evaluation value: vertical axis).
- The displays for the other operators OP2 to OP4 shown in FIGS. 14(B) to 14(D) are the same as that for the operator OP1.
- a threshold value 112 indicating an upper limit of the evaluation value is set in advance for the evaluation value of the stress state.
- The first statistical information screen 111 shows a state in which the evaluation value of the stress state of the operator OP4 exceeded the threshold value 112 when dealing with the customer MP, so the administrator can recognize that measures need to be taken for the operator OP4 (for example, having the operator OP4 take a break or a vacation).
- In addition, the graph for the operator OP4, whose stress state evaluation value exceeds the threshold value 112, is highlighted, so that the administrator can quickly direct attention to the stress state evaluation value of the operator OP4.
- a different threshold value 112 may be set for each operator in consideration of the stress tolerance of the individual.
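- A minimal sketch of such a per-operator check is shown below; the numbers are made up for the example, and only the idea of comparing each operator's evaluation value against an individually set threshold 112 comes from the text.

```python
# Illustrative stress evaluation values (e.g. averaged LF/HF) and per-operator thresholds.
stress_values = {"OP1": 1.2, "OP2": 1.6, "OP3": 1.1, "OP4": 2.7}
thresholds = {"OP1": 2.5, "OP2": 2.5, "OP3": 2.0, "OP4": 2.5}

def operators_to_highlight(values: dict, limits: dict) -> list:
    """Operators whose stress evaluation value exceeds their own threshold."""
    return [op for op, v in values.items() if v > limits.get(op, float("inf"))]

print(operators_to_highlight(stress_values, thresholds))  # ['OP4']
```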
- By checking, on the first statistical information screen 111, the response time to users and the evaluation value of the users' emotions, which may be factors in the stress state, the administrator can grasp (or estimate) the factors that influenced the stress state of the operator OP4, and can therefore take more appropriate measures for the operator OP4.
- The administrator can display a second statistical information screen 121 as shown in FIG. 15 by selecting one or more operators on the main screen 101 (see FIG. 13) and pressing the statistical information 2 button 120 for displaying the second statistical information.
- the second statistical information screen 121 shown in FIG. 15 shows an example in which all operators OP1 to OP10 and the like are selected on the main screen 101 and the statistical information 2 button 120 is pressed.
- On the second statistical information screen 121, the response time, the evaluation value of the users' emotions, and the evaluation value of the operator's stress state shown in FIG. 14 are shown (here, as average values) over a predetermined period (for example, one month) for each operator listed on the horizontal axis.
- Here, two threshold values are set in advance for the evaluation value of the stress state. When an operator's stress state evaluation value exceeds the second threshold value 123, the administrator can recognize that measures for that operator may become necessary (or that preliminary measures may be taken), and when an operator's stress state evaluation value exceeds the first threshold value 122, the administrator can recognize that immediate measures are necessary for that operator (here, the operator OP4).
- information on the stress state can be displayed simultaneously (within the same screen) for a plurality of operators (more preferably, all operators) operating the operator terminal 2.
- the administrator can quickly grasp the stress state of a plurality of operators.
- In addition, since the information on the operator's stress state and the information on the user's emotion can be displayed simultaneously (on the same screen), the administrator can more appropriately grasp the factors affecting the stress state of each operator.
- The administrator can display a vital sensing screen 131 as shown in FIG. 16 by selecting any operator on the main screen 101 (see FIG. 13) and pressing the vital information button 130.
- On the vital sensing screen 131 shown in FIG. 16, data relating to the vital information of the operator, which underlies the stress state evaluation value, is shown.
- Specifically, information 132 indicating the detected position of the operator's face, information 133 indicating the detected skin region of the operator, information 134 on the operator's pulse, information 135 on the temporal change of the operator's heart rate, and information 137 on the temporal change of pNN50 (the proportion of successive heartbeats whose RR intervals differ by 50 milliseconds or more) are displayed.
- the administrator can confirm from the vital sensing screen 131 that vital information from the operator has been normally acquired.
- The administrator can display a user emotion estimation screen 141 as shown in FIG. 17 by selecting any operator on the main screen 101 (see FIG. 13) and pressing the user information button 140.
- In the graph 143 showing the evaluation value of the user's emotion, estimated values for the user's four emotion levels of anger, sadness, surprise, and joy are shown, and the sum of these estimated values corresponds to the evaluation value of the user's emotion shown in FIGS. 14 and 15.
- Each emotion is given a predetermined weight; for example, when the estimated value for the user's degree of anger is large, the evaluation value of the user's emotion becomes large, and the influence on the operator's stress tends to be large.
- The two-way video communication system and operator management method according to the present disclosure have the effect of enabling the stress state of each operator to be grasped while suppressing an increase in system cost, and are useful as a two-way video communication system in which a video of a user operating a kiosk terminal and a video of an operator operating an operator terminal are communicated bidirectionally between the kiosk terminal and the operator terminal, and as a method of managing the operators.
- 1: Kiosk terminal, 2: Operator terminal, 3: Management server, 5: Monitor, 11: Housing, 12: Front monitor (first display unit), 13: Hand monitor, 14: Front camera (first camera), 15: Hand camera, 16: IC card reader, 17: Speaker, 18: Microphone, 21: Gantry, 22: First monitor (second display unit), 23: Second monitor, 24: Front camera (second camera), 25: Hand camera, 26: Headset, 27: Table, 28: Speaker, 29: Microphone, 31: Control unit, 32: Communication unit (first communication unit), 33: Storage unit, 35: Screen control unit, 36: Emotion information generation unit, 37: Voice control unit, 38: Voice conversion unit, 41: Control unit, 42: Communication unit (second communication unit), 43: Storage unit, 45: Screen control unit, 46: Vital information generation unit, 47: Voice recognition unit, 48: Response time calculation unit, 51: Operation button, 52: Call button, 53: Operation button, 61: Operator's front video, 62: Operator's hand video, 63: Hand video, 71: Operation button, 72: User's front video, 73: User's hand video, 81: Control unit, 82: Communication unit, 83: Storage unit, 84: Monitor, 91: Vital information
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Economics (AREA)
- Operations Research (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Development Economics (AREA)
- Educational Administration (AREA)
- Game Theory and Decision Science (AREA)
- Telephonic Communication Services (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018129994A JP7153888B2 (ja) | 2018-07-09 | 2018-07-09 | 双方向映像通信システム及びそのオペレータの管理方法 |
| JP2018-129994 | 2018-07-09 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020013060A1 true WO2020013060A1 (ja) | 2020-01-16 |
Family
ID=69142925
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/026526 Ceased WO2020013060A1 (ja) | 2018-07-09 | 2019-07-03 | 双方向映像通信システム及びそのオペレータの管理方法 |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7153888B2 (enExample) |
| WO (1) | WO2020013060A1 (enExample) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021200189A1 (ja) * | 2020-03-31 | 2021-10-07 | ソニーグループ株式会社 | 情報処理装置、および情報処理方法、並びにプログラム |
| JP2022082304A (ja) * | 2020-11-20 | 2022-06-01 | 株式会社日立ソリューションズ・クリエイト | 作業従事者管理支援システムおよび作業従事者管理支援処理方法 |
| WO2022180854A1 (ja) * | 2021-02-26 | 2022-09-01 | 株式会社I’mbesideyou | ビデオセッション評価端末、ビデオセッション評価システム及びビデオセッション評価プログラム |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7519283B2 (ja) | 2020-12-08 | 2024-07-19 | 株式会社日立ソリューションズ・クリエイト | 不正操作検知システム |
| JP7544344B2 (ja) * | 2021-01-14 | 2024-09-03 | 株式会社サテライトオフィス | オペレーターシステム、オペレーターシステムのプログラム |
| JP7179384B1 (ja) | 2021-12-07 | 2022-11-29 | 株式会社Abelon | サーバ、情報処理方法、およびプログラム |
| WO2025187228A1 (ja) * | 2024-03-08 | 2025-09-12 | 日本電気株式会社 | 情報処理装置、情報処理方法およびプログラム |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009533998A (ja) * | 2006-04-17 | 2009-09-17 | Citibank, N.A. | Method and system for video communication |
| JP2009294647A (ja) * | 2008-05-09 | 2009-12-17 | Agi:Kk | Behavior analysis device and call center system |
| JP2012195863A (ja) * | 2011-03-17 | 2012-10-11 | Oki Networks Co Ltd | Call center system, call center server, call center program, and automatic incoming call distribution device |
| JP2016529422A (ja) * | 2013-08-12 | 2016-09-23 | Koninklijke Philips N.V. | Room organization system |
| JP2018010608A (ja) * | 2016-07-13 | 2018-01-18 | Yokogawa Electric Corporation | Method and system for context-based operator assistance for a control system |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4703516B2 (ja) * | 2006-08-29 | 2011-06-15 | Fujitsu FSAS Inc. | Operator management system and operator management method in a call center |
-
2018
- 2018-07-09 JP JP2018129994A patent/JP7153888B2/ja active Active
-
2019
- 2019-07-03 WO PCT/JP2019/026526 patent/WO2020013060A1/ja not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009533998A (ja) * | 2006-04-17 | 2009-09-17 | Citibank, N.A. | Method and system for video communication |
| JP2009294647A (ja) * | 2008-05-09 | 2009-12-17 | Agi:Kk | Behavior analysis device and call center system |
| JP2012195863A (ja) * | 2011-03-17 | 2012-10-11 | Oki Networks Co Ltd | Call center system, call center server, call center program, and automatic incoming call distribution device |
| JP2016529422A (ja) * | 2013-08-12 | 2016-09-23 | Koninklijke Philips N.V. | Room organization system |
| JP2018010608A (ja) * | 2016-07-13 | 2018-01-18 | Yokogawa Electric Corporation | Method and system for context-based operator assistance for a control system |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021200189A1 (ja) * | 2020-03-31 | 2021-10-07 | ソニーグループ株式会社 | 情報処理装置、および情報処理方法、並びにプログラム |
| JP2022082304A (ja) * | 2020-11-20 | 2022-06-01 | 株式会社日立ソリューションズ・クリエイト | 作業従事者管理支援システムおよび作業従事者管理支援処理方法 |
| JP7572843B2 (ja) | 2020-11-20 | 2024-10-24 | 株式会社日立ソリューションズ・クリエイト | 作業従事者管理支援システムおよび作業従事者管理支援処理方法 |
| WO2022180854A1 (ja) * | 2021-02-26 | 2022-09-01 | 株式会社I’mbesideyou | ビデオセッション評価端末、ビデオセッション評価システム及びビデオセッション評価プログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2020009177A (ja) | 2020-01-16 |
| JP7153888B2 (ja) | 2022-10-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2020013060A1 (ja) | 双方向映像通信システム及びそのオペレータの管理方法 | |
| KR100884297B1 (ko) | 전자 회의 지원 방법 및 전자 회의 시스템에서의 정보 단말장치 | |
| KR102165699B1 (ko) | 사용자 맞춤형 실시간 피부질환 관리시스템 | |
| KR20140061620A (ko) | 증강 현실을 활용한 소셜 네트워크 서비스 제공 시스템 및 방법과, 디바이스 | |
| CN107918726A (zh) | 距离感应方法、设备及存储介质 | |
| JP7400886B2 (ja) | ビデオ会議システム、ビデオ会議方法、およびプログラム | |
| US20200413009A1 (en) | Bidirectional video communication system and kiosk terminal | |
| JP6517480B2 (ja) | 接客管理システム、顧客端末、サーバー装置、接客管理方法、及び接客方法 | |
| CN114363547A (zh) | 一种双录装置、双录交互控制方法 | |
| CN114188027A (zh) | 基于智能设备的健康数据处理方法、装置、终端及介质 | |
| JP7206741B2 (ja) | 健康状態判定システム、健康状態判定装置、サーバ、健康状態判定方法、及びプログラム | |
| JP6473531B1 (ja) | 顔認証技術による自動割り勘決済システム | |
| JP2024075597A (ja) | 情報処理装置 | |
| JP7110669B2 (ja) | ビデオ会議システム、ビデオ会議方法、およびプログラム | |
| JP2025011478A (ja) | バイタル情報管理装置、バイタル情報管理システム及びバイタル情報管理方法 | |
| TWI750922B (zh) | 用於會客導引的人臉辨識系統及方法 | |
| JP2019149625A (ja) | 双方向映像通信システム及びオペレータ端末 | |
| KR101189913B1 (ko) | 고객 관리 방법 및 장치 | |
| CN115171284A (zh) | 一种老年人关怀方法及装置 | |
| JP6932120B2 (ja) | 通知装置及び通知方法 | |
| JP2010122881A (ja) | 端末装置 | |
| JP7631628B1 (ja) | 業務支援システム、及び、携帯デバイス | |
| KR20170064730A (ko) | Vr 기기를 활용한 가상 고객 서비스 방법 및 시스템 | |
| JP7278010B1 (ja) | 複数の作業者が連携して業務を行うことを支援する方法、プログラム及び業務支援装置 | |
| JP5277235B2 (ja) | 情報処理端末およびプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19834549 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19834549 Country of ref document: EP Kind code of ref document: A1 |