CN109896365B - Guidance system - Google Patents

Guidance system

Info

Publication number
CN109896365B
CN109896365B (application CN201811354581.3A)
Authority
CN
China
Prior art keywords
person
attendance
entrance
building
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811354581.3A
Other languages
Chinese (zh)
Other versions
CN109896365A (en)
Inventor
小林敬幸
高野安司
本桥弘光
大竹晋资
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Building Systems Co Ltd
Original Assignee
Hitachi Building Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Building Systems Co Ltd filed Critical Hitachi Building Systems Co Ltd
Publication of CN109896365A
Application granted
Publication of CN109896365B
Legal status: Active

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Elevator Control (AREA)

Abstract

A guidance system includes: an entrance person determination unit that performs an entrance person determination of determining whether input information of a person entering a building matches information of a worker who works in the building; an attendance management unit that, when the result of the entrance person determination is true, changes the attendance state of the entering person to attendance among the attendance states managed for the workers in the building; a conversation control unit that carries out a conversation in which a physical or virtual robot presents one or more questions to the entering person and receives from the entering person one or more answers including the name of the entering person; a meeter specifying unit that specifies the meeter of the entering person based on the contents of the conversation; an attendance determination unit that performs an attendance determination of determining whether the attendance state of the specified meeter is attendance; and a guide unit that, when the result of the attendance determination is true, introduces to the entering person, from the robot, the elevator to be used, i.e., an elevator capable of transporting the entering person to the destination floor (the floor of the meeter), and the destination floor.

Description

Guidance system
Technical Field
The present invention relates to a system for guiding persons who enter a building.
Background
With the recent progress of robot control technology, robots have come to take over operations conventionally performed by people. For example, according to patent document 1, after an elevator landing call is made, an autonomously traveling physical robot rides in the car, operates the destination floor button, and guides the entering person (elevator user) to the destination.
A person sometimes enters a building that contains offices, such as an office building or a multi-tenant complex, for a meeting. However, the entering person cannot always meet the meeter smoothly. Possible reasons include that the entering person does not know which elevator to use to reach the floor where the meeter is, does not know on which floor the meeter is, or has forgotten the name of the meeter. As countermeasures, it is conceivable to station a receptionist who checks the entering person and the meeter (for example, the attendance state of the meeter) and guides the entering person on the way from the building entrance to the elevator hall, or to station an attendant in the car of the elevator. However, it is burdensome to confirm the entering person and the meeter, guide the entering person to the elevator, and guide the entering person to the destination floor by human labor.
With the technique of patent document 1, guidance to stores in a commercial facility can be expected without human labor. However, the technique of patent document 1 cannot be applied to the guidance of a person entering a building that contains offices. Patent document 1 neither discloses nor suggests a technique for confirming the entering person and the meeter.
Patent document 1: Japanese Patent Laid-Open No. 2008-162758.
Disclosure of Invention
In order to solve the above problem, a guidance system according to the present invention includes: an entrance person determination unit that performs an entrance person determination of determining whether a person identified from input information of a person entering a building is a worker who works in the building; an attendance management unit that, when the result of the entrance person determination is true, changes the attendance state of the entering person to attendance among the attendance states managed for the workers in the building; a conversation control unit that carries out a conversation in which a physical or virtual robot presents one or more questions to the entering person and receives from the entering person one or more answers including the name of the entering person; a meeter specifying unit that specifies the meeter of the entering person based on the contents of the conversation; an attendance determination unit that performs an attendance determination of determining whether the attendance state of the specified meeter is attendance; and a guide unit that, when the result of the attendance determination is true, introduces to the entering person, from the robot, the elevator to be used, i.e., an elevator capable of transporting the entering person to the destination floor (the floor of the meeter), and the destination floor.
According to the present invention, when the meeter has come to work, the entering person can be guided to the floor of the meeter without human labor.
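As a purely illustrative, non-limiting sketch of how the above units could cooperate, the following Python outline models the flow in the order described; all names used here (for example, guide_entering_person, staff_table, robot) are assumptions made for illustration and are not part of the disclosed embodiment.

```python
# Minimal sketch of the unit flow described above (hypothetical names, not the claimed implementation).

def guide_entering_person(face_info, staff_table, robot, elevators):
    """Entrance person determination -> attendance management -> conversation ->
    meeter specification -> attendance determination -> guidance."""
    worker = staff_table.get(face_info)                    # entrance person determination
    if worker is not None:
        worker["attendance"] = "attendance"                # attendance management: mark the worker as at work
        return                                             # a worker needs no guidance
    answers = robot.converse(["Whom are you visiting?"])   # conversation control (question scenario)
    meeter = next((w for w in staff_table.values()
                   if w["name"] == answers.get("meeter_name")), None)  # meeter specification
    if meeter is not None and meeter["attendance"] == "attendance":    # attendance determination
        floor = meeter["floor"]                                        # destination floor = floor of the meeter
        elevator = next((e for e in elevators if floor in e["stoppable_floors"]), None)
        if elevator is not None:
            robot.say(f"Please take elevator {elevator['id']} to floor {floor}.")  # guide unit
    else:
        robot.say("The person you are visiting is not at work at the moment.")
```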
Drawings
Fig. 1 shows an example of the arrangement of a virtual robot.
Fig. 2 is a block diagram of the guidance system.
Fig. 3 is a structural diagram of a virtual robot.
Fig. 4 is a configuration diagram of the robot controller.
Fig. 5 is a structural diagram of a robot monitoring center.
Fig. 6 is a configuration diagram of the attendance management apparatus.
Fig. 7 is a configuration diagram of the guidance device.
Fig. 8 is a configuration diagram of a monitoring camera control device.
Fig. 9 shows a modification of the virtual robot.
Fig. 10 is a structural diagram of a building summary table.
Fig. 11 is a structural diagram of an office question table.
Fig. 12 is a structural diagram of a schedule table.
Fig. 13 is a configuration diagram of a store area table.
Fig. 14 is a structural diagram of a staff table.
Fig. 15 is a structural diagram of an elevator table.
Fig. 16 is a flowchart of the attendance status update process.
Fig. 17 is a flowchart of the entrance guidance process.
Description of the symbols
100: virtual robot, 130: attendance management device, 150: guidance device, 160: monitoring camera control device.
Detailed Description
In the following description, the "interface unit" may be one or more interfaces. The one or more interfaces may include one or more interface devices and one or more communication interface devices for one or more I/O devices. The one or more communication Interface devices may be one or more of the same kind of communication Interface device (e.g., one or more NICs (Network Interface Card)), or 2 or more of different kinds of communication Interface devices (e.g., NICs and HBAs (Host Bus Adapter)).
In the following description, the "memory unit" may be one or more memories. The memory section is typically a main storage device. The at least one memory may be volatile memory or non-volatile memory. The memory unit is mainly used for processing by the processor unit.
In the following description, the "PDEV unit (Physical Device Physical equipment unit)" may be one or more PDEVs. The PDEV portion is typically an auxiliary storage device. "PDEV" denotes a physical storage device, typically a non-volatile storage device, such as an HDD (Hard Disk Drive) or SSD (Solid State Drive).
In the following description, the "storage unit" includes at least a memory unit and a memory unit in the PDEV unit.
In the following description, the "processor unit" may be one or more processors. The at least one processor is typically a microprocessor such as a CPU (Central Processing Unit), but may be another type of processor such as a GPU (Graphics Processing Unit). At least one processor may be single core or multi-core. A portion of the processor may be a hardware Circuit (e.g., an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit)) that performs a portion or all of the processing.
In the following description, functions are sometimes described by the expression "kkk unit" (excluding the interface unit, the storage unit, and the processor unit); the functions may be realized by the processor unit executing one or more computer programs, or may be realized by one or more hardware circuits. When a function is realized by the processor unit executing a program, predetermined processing is performed while appropriately using the storage unit and/or the interface unit, so the function may be regarded as at least a part of the processor unit. Processing described with a function as the subject may be processing performed by the processor unit or by a device having the processor unit. A program may be installed from a program source. The program source may be, for example, a program distribution computer or a computer-readable recording medium (for example, a non-transitory recording medium). The description of each function is an example, and a plurality of functions may be combined into one function, or one function may be divided into a plurality of functions.
In the following description, although information is described by the expression of the "xxx table" in some cases, the information may be expressed by an arbitrary data structure. That is, the "xxx table" can be referred to as "xxx information" in order to indicate that the information does not depend on the data structure. In the following description, the structure of each table is an example, and one table may be divided into 2 or more tables, and all or a part of the 2 or more tables may be one table.
Hereinafter, a mode for carrying out the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 shows an example of the installation of a virtual robot.
The virtual robot 100 is installed in an elevator hall 191 of a building and guides an entering person to a desired floor. In order to make effective use of the time during which the entering person waits for an elevator in the elevator hall 191, the virtual robot 100 is installed in the elevator hall 191 in the present embodiment. However, the virtual robot 100 may be installed in a place other than the elevator hall 191, for example, in the car of an elevator as shown in Fig. 9. In addition, a physical robot may be used instead of or in addition to the virtual robot 100.
Fig. 2 shows a system configuration of the guidance system 170 according to the present embodiment.
The building 190 includes a plurality of elevators 180 and a plurality of elevator control devices 140 for controlling the plurality of elevators 180, respectively. The elevator control device 140 controls the raising and lowering of the car of the elevator 180 to be controlled.
The guidance system 170 includes: a guidance device 150 that determines the destination floor and the elevator 180 to be used, which are to be introduced to the entering person; a virtual robot 100 that introduces the destination floor and the elevator 180 determined by the guidance device 150 to the entering person; a robot control device 110 that controls the virtual robot 100; and a robot monitoring center 120 that accumulates and analyzes data acquired by the virtual robot 100 via the robot control device 110. Further, the guidance system 170 includes: a monitoring camera control device 160 that performs face recognition based on a face image of the entering person captured by a monitoring camera (not shown) installed in the building 190; and an attendance management device 130 that determines whether the person entering the building 190 identified by the face recognition is a worker who works in the building 190 and updates the attendance state of the entering person based on the determination result. The robot control device 110, the guidance device 150, the robot monitoring center 120, the attendance management device 130, the monitoring camera control device 160, and the elevator control devices 140 can communicate with one another via the communication network 195. At least two of the robot control device 110, the guidance device 150, the robot monitoring center 120, the attendance management device 130, and the monitoring camera control device 160 may be integrated into one device, or at least one of them may be divided into a plurality of devices.
Fig. 3 is a structural diagram of the virtual robot 100.
The virtual robot 100 is a computer having a function of performing a conversation with a human or a human interface device for performing a conversation with a human. The virtual robot 100 includes an interface unit 103, a storage unit 101, and a CPU104 connected to these units.
The interface unit 103 communicates with the robot controller 110.
The input/output devices 102 include, for example, a monitor 102e that displays an image of the virtual robot, a human body sensor 102c and a camera 102a that detect a human, and a microphone 102b and a speaker 102d that are used to converse with a human.
The storage unit 101 stores one or more programs that are executed by the CPU104 to realize the interactive control unit 101a and the input/output unit 101 b.
When a human being is recognized by the human body sensor 102c and the camera 102a, the conversation control unit 101a carries out a conversation with the human being through the microphone 102b and the speaker 102d under an instruction from the robot control device 110. The input/output unit 101b performs input/output of data between an input data processing unit 111a and an output data processing unit 111c, which will be described later, of the robot control device 110.
The virtual robot 100 acquires an image from the camera 102a, acquires a sound from the microphone 102b, and transmits the acquired image and sound to the robot controller 110. The virtual robot 100 speaks from the speaker 102d in accordance with an instruction from the robot control device 110. When a human being is detected by at least one of the human body sensor 102c and the camera 102a, the virtual robot 100 starts a conversation.
Fig. 4 is a configuration diagram of the robot controller 110.
The robot controller 110 includes an interface unit 112, a storage unit 111, and a CPU113 connected to these units.
The interface unit 112 communicates with the virtual robot 100 or with an external device via the communication network 195.
The storage unit 111 stores one or more programs that realize the input data processing unit 111a, the dialogue processing unit 111b, and the output data processing unit 111c when executed by the CPU 113.
The input data processing unit 111a processes data received from the virtual robot 100. The dialogue processing unit 111b determines the content of the next utterance (for example, a question) based on the input data and a question scene described later. The output data processing unit 111c transmits the utterance determined by the dialogue processing unit 111b to the virtual robot 100. The output data processing unit 111c transmits the contents of the dialog (one or more questions and one or more answers) to the guidance device 150.
The robot control device 110 may be integrated with the virtual robot 100. That is, the virtual robot 100 may store the question scenario described later, and the conversation control unit 101a of the virtual robot 100 may carry out the conversation based on the answers of the entering person and the question scenario. Alternatively, the conversation control unit 101a of the virtual robot 100 may carry out the conversation in accordance with instructions from the robot control device 110 (for example, a conversation in which the robot speaks the utterance content received from the output data processing unit 111c of the robot control device 110 and transmits the answer to that utterance to the robot control device 110).
Fig. 5 is a block diagram of the robot monitoring center 120.
The robot monitoring center 120 includes an interface unit 123, a storage unit 121, and a CPU124 connected to these units.
The interface unit 123 communicates with an external device via a communication network 195.
The storage unit 121 stores one or more programs that are executed by the CPU 124 to realize the dialogue storage unit 121a, the information processing unit 121b, and the destination floor indicating unit 122c.
The dialogue storage unit 121a stores the question scenarios in the robot control device 110 in advance. The information processing unit 121b specifies the elevator 180 and the destination floor based on the information from the guidance device 150. The destination floor indicating unit 122c indicates the determined destination floor to the elevator control device 140 of the determined elevator 180. In response to the instruction, the elevator control device 140 stops the car of the elevator 180 to be controlled at the indicated destination floor and opens the doors of the car at the destination floor.
Fig. 6 is a configuration diagram of the attendance management apparatus 130.
The attendance management device 130 includes an interface unit 132, a storage unit 131, and a CPU 133 connected to these units.
The interface unit 132 communicates with an external device via a communication network 195.
The storage unit 131 stores one or more programs that are executed by the CPU133 to realize the entrance person determination unit 131a, the attendance management unit 131b, and the schedule management unit 131 c.
The entrance person determination unit 131a performs an entrance person determination of determining whether the face information of a person entering the building matches the face information of a worker who works in the building 190. The "face information of a person entering the building" is a face image of the entering person captured by a monitoring camera (not shown) installed in the building 190, or information based on the face image (for example, information representing a feature amount of the face image), and is, for example, face information received from the monitoring camera control device 160.
When the result of the entrance person determination is true (that is, when the entering person is a worker who works in the building 190), the attendance management unit 131b changes the attendance state of the entering person to attendance.
The schedule management unit 131c manages schedules for offices in the building 190.
Fig. 7 is a configuration diagram of the guidance device 150.
The guidance device 150 includes an interface unit 152, a storage unit 151, and a CPU 153 connected to these units. The guidance device 150 may be integrated with the attendance management device 130.
The interface unit 152 communicates with an external device via the communication network 195.
The storage unit 151 stores one or more programs that realize the meeter specifying unit 151a, the attendance determination unit 151b, and the guide unit 151c when executed by the CPU 153.
The meeter specifying unit 151a specifies the meeter of the entering person based on the contents of the conversation received from the robot control device 110. Here, "specifying the meeter of the entering person from the contents of the conversation" means, for example, finding data that matches a meeter attribute included in the contents of the conversation (for example, an answer to a question about a meeter attribute, such as at least the name among the company name, affiliation, and name of the meeter) in at least the staff table out of the staff table and the schedule table, which will be described later.
The attendance determination unit 151b performs an attendance determination of determining whether the attendance state of the specified meeter is "attendance". The attendance determination may be performed, for example, by inquiring of the attendance management device 130 whether the specified meeter is at "attendance" and analyzing the answer received in response to the inquiry.
When the result of the attendance determination is true (that is, when the attendance state of the specified meeter is "attendance"), the guide unit 151c introduces, to the entering person from the virtual robot 100 via the robot control device 110, the elevator 180 capable of transporting the entering person to the destination floor, i.e., the floor of the meeter, and the destination floor. That is, the virtual robot 100 makes an utterance introducing the elevator 180 and the destination floor to the entering person. This allows the entering person to know which elevator 180 to use to meet the meeter and to which floor to go.
The "attendance status of the identified meeter is" attendance "can be identified from a staff table described later. The "destination floor" may be, for example, a floor at which a meeter is working (a floor specified from a staff table described later), a floor at which a meeting room in which a meeting is reserved (a floor specified from a calendar described later) specified from a calendar described later is provided, or a floor specified from a meeter in response to a later-described inquiry to the meeter (an inquiry to request a response to permit an entrance person to go to the destination floor). The "elevator capable of transporting the entering person to the destination floor" may be an elevator specified from an elevator table described later, or may be an elevator specified by an answer to an inquiry (an inquiry as to whether or not a stop at the destination floor) to each of the elevator control devices 140.
Fig. 8 is a configuration diagram of the monitoring camera control device 160.
The monitoring camera control device 160 includes an interface unit 162, a storage unit 161, and a CPU163 connected thereto.
The interface unit 162 communicates with an external device via the communication network 195.
The storage unit 161 stores one or more programs that are executed by the CPU163 to realize the face recognition unit 161a and the face information storage unit 161 b.
The face recognition unit 161a performs face recognition based on the face image of the entering person input from the camera 102a. The face information storage unit 161b stores in advance the face information of each worker in the building 190, or the face information of each entering person whose entrance person determination result by the attendance management device 130 is false. When the face information of each worker in the building 190 is stored in advance, the monitoring camera control device 160 may have the entrance person determination unit 131a instead of the attendance management device 130.
Fig. 10 is a structural diagram of a building summary table.
The building summary table F1 is part of the question scenario. The question scenario is stored, for example, in the robot control device 110, and may be distributed among two or more devices of the guidance system 170. The question scenario is composed of a plurality of detailed question scenarios and a summary question used to determine which of the plurality of detailed question scenarios should be used, and the building summary table F1 is the table corresponding to the summary question. The building summary table F1 stores information of a destination floor F1a and an answer content F1b for each answer to the summary question (for example, a question asking which of the store area, the office area, and the residential area of the building 190 the entering person is headed for), that is, for each category of area of the building 190. The destination floor F1a represents the floors corresponding to the area indicated by the answer to the summary question. The answer content F1b represents the content of the answer to the summary question. A detailed question scenario (question table) is associated with each answer content F1b.
Fig. 11 is a structural diagram of an office question table.
The office question table F2 corresponds to the detailed question scenario for the office area. The office question table F2 stores information of a question number F2a and a question content F2b for each question. The question number F2a indicates the number of the question. The question content F2b represents a question asked when the entering person is not a worker in the building 190. The detailed question scenario is a question scenario for specifying the floor (destination floor) desired by the entering person from information such as the meeter.
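As an illustration only, the question scenario could be held in memory roughly as in the following sketch; the concrete answers, floors, and questions shown are placeholders invented here, and the variable names are assumptions rather than part of the embodiment.

```python
# Hypothetical in-memory layout of the question scenario: the building summary table F1
# maps each answer to the summary question to destination floors and to the detailed
# question table to use next; the office question table F2 holds the detailed questions.
building_summary_table = {
    "summary_question": "Are you headed for the store area, the office area, or the residential area?",
    "answers": {
        # answer content (F1b) -> destination floors (F1a) and the next detailed question table
        "office area": {"destination_floors": ["example floors"], "detailed_table": "office_question_table"},
        "store area":  {"destination_floors": ["example floors"], "detailed_table": "store_question_table"},
    },
}

office_question_table = {
    # question number (F2a) -> question content (F2b), asked of entrants who are not workers
    1: "May I have the name of the person you are visiting?",
    2: "May I have the company or affiliation of the person you are visiting?",
}
```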
Fig. 12 is a structural diagram of a schedule table.
The schedule table G1 is stored in any device of the guidance system 170, for example, the attendance management device 130. The schedule table G1 stores, for each reservation relating to an office in the building 190, information of a destination floor G1a, a conference room name G1b, and a time period G1c. The destination floor G1a represents the floor on which the conference room of the scheduled meeting is located. The conference room name G1b indicates the name of the conference room of the scheduled meeting. The time period G1c includes the scheduled time period of the meeting and the name of the meeter.
Fig. 13 is a configuration diagram of a store area table.
The store area table G2 is stored in any device of the guidance system 170, for example, the robot control device 110. The store area table G2 stores information of a destination floor G2a, an area summary G2b, and store names G2c for each floor of the store area. The destination floor G2a represents a floor. The area summary G2b shows a summary of the store area located on the corresponding floor. The store names G2c indicate a list of the names of the stores located on the corresponding floor.
In the present embodiment, guidance to stores can also be performed. For example, guidance to a store may be started when meeter-related information cannot be specified for an entering person who is not a worker (no meeter-related information is associated with the entering person), or when the store area is given as the answer to the summary question. For example, the guidance device 150 announces information about the stores in the building 190 to the entering person based on the detailed question scenario for the store area and the store area table G2, identifies the store that the entering person desires based on the answers of the entering person, and introduces the destination floor of the identified store and the elevator to be used to the entering person.
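For illustration, store guidance as described above amounts to a lookup against the store area table G2; the sketch below, with invented store names and floors, is only one possible way to express it.

```python
# Hypothetical sketch of store guidance: match the store named by the entering person
# against the store area table G2 and return the destination floor to be introduced.
store_area_table = [
    {"destination_floor": "2F", "area_summary": "restaurants", "store_names": ["Cafe A", "Restaurant B"]},
    {"destination_floor": "3F", "area_summary": "clothing",    "store_names": ["Shop C"]},
]

def find_store_floor(desired_store: str):
    """Return the floor of the store the entering person asked for, or None if unknown."""
    for row in store_area_table:
        if desired_store in row["store_names"]:
            return row["destination_floor"]
    return None
```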
Fig. 14 is a structural diagram of a staff table.
The staff table G3 is stored in any device of the guidance system 170, for example, the attendance management device 130. The staff table G3 stores, for each worker in the building 190, information of a worker name G3a, a company name G3b, a work place address G3c, and an attendance status G3d. The worker name G3a represents the name of the worker. The company name G3b indicates the name of the company for which the worker works. The work place address G3c represents the address of the place where the worker works. When the worker is registered at a plurality of locations, the work place address G3c indicates a plurality of addresses corresponding to the plurality of locations and a flag indicating at which of the plurality of locations the worker is working. Therefore, when the worker is recognized by the monitoring camera of another building different from the building 190, the flag is set for the location corresponding to that other building. The attendance status G3d represents the attendance state of the worker (for example, "attendance" or "off duty"). The staff table G3 may also store face information for each worker.
Fig. 15 is a structural diagram of an elevator table.
The elevator table G4 is stored in any one of the devices of the guidance system 170, such as the guidance device 150. The elevator table G4 stores information of the elevator identifier G4a and the stoppable floor G4b for each elevator 180 in the building 190. The elevator identifier G4a represents an identifier of the elevator 180. The stoppable floor G4b represents a floor at which the car of the elevator 180 can be stopped.
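Selecting the elevator to be used from the elevator table G4 reduces to finding an elevator whose stoppable floors include the destination floor. The following sketch, with invented identifiers and floors, illustrates this; it is an assumption for explanation, not the disclosed implementation.

```python
# Hypothetical sketch: choose an elevator whose stoppable floors (G4b) include the destination floor.
elevator_table = [
    {"elevator_id": "EV-1", "stoppable_floors": {1, 2, 3, 4, 5}},
    {"elevator_id": "EV-2", "stoppable_floors": {1, 10, 11, 12}},
]

def select_elevator(destination_floor: int):
    """Return the identifier of an elevator that can stop at the destination floor, or None."""
    for elevator in elevator_table:
        if destination_floor in elevator["stoppable_floors"]:
            return elevator["elevator_id"]
    return None
```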
An example of the processing performed in the present embodiment is described below.
Fig. 16 is a flowchart of the attendance status update process.
Images of persons entering and leaving the building 190 are input from the monitoring camera to the monitoring camera control device 160. The face recognition unit 161a performs face recognition based on the face images in the captured images of the entering or leaving persons (S20). The face recognition unit 161a transmits, to the attendance management device 130, information including the face information resulting from the face recognition and an indication of whether the person was entering or leaving.
In the attendance management device 130, the entrance person determination unit 131a refers to the staff table G3 and determines which worker the entering or leaving person is, based on the information (the face information and the indication of entering or leaving) from the monitoring camera control device 160 (S21). Specifically, the entrance person determination unit 131a determines whether the face information from the monitoring camera control device 160 matches the face information of any of the workers.
If the determination result in step S21 is true (S21: yes), the attendance management unit 131b updates the attendance status G3d corresponding to the matched worker (S22). Specifically, when the worker is entering, the attendance management unit 131b updates the attendance status G3d corresponding to the worker to "attendance". When the worker is leaving, the attendance management unit 131b updates the attendance status G3d corresponding to the worker to "off duty".
In addition, for a worker having a plurality of workplaces, the attendance management unit 131b identifies the building in which the monitoring camera that captured the face image is installed (for example, a monitoring camera control device is provided for each building, and the building is identified from the monitoring camera control device that transmitted the face information), and records in the staff table G3 a flag indicating "attendance" at the identified building.
The input information of the entering person may be biometric information such as fingerprint information, or information in an IC card such as an employee ID card, instead of or in addition to the face information. However, according to the present embodiment, since the face image captured by the monitoring camera is used as the input information of the entering person, whether the entering person is a worker can be determined without requiring any special operation by the entering person, and, if the entering person is determined to be a worker, the attendance state of that worker can be updated automatically. In addition, since the attendance state becomes "attendance" at the moment the worker enters the building in this way, guidance can be provided to an entering person for whom that worker is the meeter, so the entering person can receive guidance to the floor of the meeter even before the meeter reaches his or her work place.
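Purely as an illustration of the attendance status update process (S20 to S22), the following sketch updates a staff table keyed by face information; the record fields and building identifiers are assumptions introduced here.

```python
# Hypothetical sketch of the attendance status update (S21, S22), including the workplace
# flag recorded for a worker who is registered at several buildings.
staff_table = {
    # face information -> worker record (worker name G3a, attendance status G3d, workplace flags)
    "face_001": {"name": "Worker A", "attendance": "off duty",
                 "workplaces": {"building_190": False, "other_building": False}},
}

def update_attendance(face_info: str, direction: str, building: str = "building_190") -> bool:
    """direction is 'enter' or 'exit', as reported together with the face recognition result."""
    worker = staff_table.get(face_info)
    if worker is None:
        return False                                   # entrance person determination is false: not a worker
    worker["attendance"] = "attendance" if direction == "enter" else "off duty"
    for b in worker["workplaces"]:                     # flag the building whose camera captured the image
        worker["workplaces"][b] = (b == building and direction == "enter")
    return True
```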
Fig. 17 is a flowchart of the entrance guidance process. The entrance guidance process is started when the virtual robot 100 detects an entrance.
It is determined whether the entering person detected by the virtual robot 100 is a worker (S31). This determination is performed by, for example, either of the following methods.
The input/output unit 101b transmits the image captured by the camera 102a of the virtual robot 100 to the robot control device 110, and the robot control device 110 transmits the captured image to the monitoring camera control device 160. The face recognition unit 161a of the monitoring camera control device 160 performs face recognition of the face image in the captured image. The monitoring camera control device 160 or the attendance management device 130 determines whether or not the face information based on the face recognition result matches the face information of any one worker. When the determination result is true, the determination result of S31 is true.
The entrance person determination unit 131a of the attendance management device 130 transmits the face information of entering persons whose entrance person determination result is false to the robot control device 110, and the robot control device 110 accumulates this face information. The virtual robot 100 or the robot control device 110 performs face recognition of the face image in the image captured by the camera 102a of the virtual robot 100, and determines whether the face information based on the face recognition result matches the face information of any entering person whose entrance person determination result was false. When this determination result is false, the determination result of S31 is true.
When the determination result at S31 is true (S31: yes), the entrance guidance process ends; when the determination result at S31 is false (S31: no), the process continues. In this way, the targets of the entrance guidance process can be narrowed down to appropriate persons, such as entering persons who are not workers of the building 190.
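The second method above can be pictured as a simple membership test against the faces of entrants already judged not to be workers; the sketch below is an assumption-based illustration of step S31, not the disclosed implementation.

```python
# Hypothetical sketch of step S31 (second method): the robot side keeps only the faces of
# entrants whose entrance person determination was false (i.e., who are not workers).
non_worker_faces = {"face_101", "face_102"}   # accumulated from the attendance management device

def is_worker_at_robot(face_info: str) -> bool:
    """Return True (guidance is skipped) unless the face matches a known non-worker entrant."""
    return face_info not in non_worker_faces
```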
When the determination result at S31 is false (S31: no), the virtual robot 100 starts a conversation with the entering person (S32). The conversation is carried out according to the question scenario (for example, the tables F1 and F2 shown in Figs. 10 and 11). The conversation between the virtual robot 100 and the entering person proceeds under the control of the robot control device 110, and the contents of the conversation are transmitted from the robot control device 110 to the guidance device 150 as needed.
Each time the meeter specifying unit 151a of the guidance device 150 receives the contents of the conversation, it determines whether the contents of the conversation include a destination floor (S33). Note that step S33 may be omitted and steps S34 and thereafter may be performed instead.
If the determination result at S33 is true (S33: yes), the guide unit 151c specifies the destination floor, specifies from the elevator table G4 the elevator 180 to be used, i.e., an elevator 180 that can stop at the specified destination floor, and notifies the robot monitoring center 120 and the guidance device 150 of the specified destination floor and the elevator 180 to be used. The destination floor indicating unit 122c of the robot monitoring center 120 indicates the destination floor to the elevator control device 140 corresponding to the elevator 180 to be used. The guide unit 151c of the guidance device 150 introduces the elevator 180 to be used and the destination floor to the entering person via the robot control device 110 and the virtual robot 100 (S40).
When the determination result at S33 is false (S33: no), the conversation continues. The meeter specifying unit 151a of the guidance device 150 receives one or more answers including the name of the entering person from the entering person, and specifies the meeter of the entering person based on the contents of the conversation (S34). For example, when the one or more answers include meeter attribute information such as the name of the meeter, or the name of the conference room and the time period of the scheduled meeting, the meeter is specified from at least one of the staff table G3 and the schedule table G1. The attendance determination unit 151b of the guidance device 150 acquires the attendance state of the specified meeter from the attendance management device 130 (S35) and determines whether the attendance state is "attendance" (S36).
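Steps S34 to S36 can be illustrated as a lookup of the meeter in the staff table G3 (or, failing that, in the schedule table G1) followed by an attendance check; the sketch below uses invented records and names and is only an assumption-based illustration.

```python
# Hypothetical sketch of S34-S36: specify the meeter from the answers, then check attendance.
staff_table = [
    {"name": "Worker A", "company": "Company X", "attendance": "attendance", "floor": 12},
]
schedule_table = [
    {"destination_floor": 8, "conference_room": "Room 801", "time_period": "10:00-11:00", "meeter": "Worker A"},
]

def specify_meeter(answers: dict):
    """Match meeter attributes in the answers (name, or reserved conference room) against the tables."""
    for worker in staff_table:
        if worker["name"] == answers.get("meeter_name"):
            return worker
    for entry in schedule_table:                       # fall back to the schedule table G1
        if entry["conference_room"] == answers.get("conference_room"):
            return next((w for w in staff_table if w["name"] == entry["meeter"]), None)
    return None

def attendance_determination(meeter) -> bool:
    """S36: true when the specified meeter's attendance state is 'attendance'."""
    return meeter is not None and meeter["attendance"] == "attendance"
```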
If the determination result at S36 is true (S36: yes), the guide unit 151c refers to the staff table G3 and determines whether the specified meeter is at work in a building different from the building 190 (S37).
When the determination result at S37 is false (S37: no), that is, when the meeter is at the office in the building 190, the guide unit 151c specifies the destination floor (the floor of the meeter), specifies from the elevator table G4 the elevator 180 to be used, i.e., an elevator 180 that can stop at the specified destination floor, and notifies the robot monitoring center 120 and the guidance device 150 of the specified destination floor and the elevator 180 to be used. The destination floor indicating unit 122c of the robot monitoring center 120 indicates the destination floor to the elevator control device 140 corresponding to the elevator 180 to be used. In this way, it can be expected that, by the time the entering person reaches the elevator 180 to be used, the car has already arrived at the floor on which the entering person is, and that the destination floor is already registered even if the entering person does not operate any button in the car. The guide unit 151c of the guidance device 150 introduces the elevator 180 to be used and the destination floor to the entering person via the robot control device 110 and the virtual robot 100 (S40).
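For illustration, the guidance step with advance call registration could look like the sketch below; the controller interface (register_destination) and the parameter names are assumptions made here, not an API disclosed in the embodiment.

```python
# Hypothetical sketch of the guidance step (S40 with advance registration of the destination floor).
def guide_to_meeter(destination_floor: int, elevator_table, elevator_controllers, robot_say):
    """Pick an elevator from the elevator table G4, register the destination floor with its
    controller so the car waits and the floor button is effectively pre-pressed, then announce."""
    elevator_id = next((e["elevator_id"] for e in elevator_table
                        if destination_floor in e["stoppable_floors"]), None)
    if elevator_id is None:
        robot_say("I am sorry, no elevator serves that floor.")
        return
    elevator_controllers[elevator_id].register_destination(destination_floor)
    robot_say(f"Please use elevator {elevator_id}; it will take you to floor {destination_floor}.")
```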
If the determination result at S37 is true (S37: yes), the guide unit 151c introduces, to the entering person via the robot control device 110 and the virtual robot 100, the other building where the meeter is located (S38). For example, the virtual robot 100 displays the name and address of the other building and the route from the building 190 to the other building. This enables the entering person to quickly move to the correct meeting place.
If the determination result at S36 is false (S36: no), the guide unit 151c notifies the entering person, via the robot control device 110 and the virtual robot 100, that the meeter is absent (S39).
As described above, according to the present embodiment, the destination floor, i.e., the floor of the meeter, can be specified based on the contents of the conversation with an entering person who is not a worker of the building 190, and the elevator 180 to be used and the destination floor can be introduced to the entering person from the virtual robot 100.
Although one embodiment of the present invention has been described above, it is an example for explaining the present invention, and the scope of the present invention is not limited to this embodiment. The present invention can also be carried out in various other forms.
For example, the functions of the virtual robot 100 may be provided as an application program downloaded to a mobile terminal such as a smartphone of the entering person. This makes it possible to receive guidance from the virtual robot 100 without waiting in line to use the virtual robot 100.
For example, when introducing the elevator 180 to be used and the destination floor to the entering person, the guide unit 151c may output a message notifying that the entering person is heading for the destination floor to the information processing terminal of the meeter (for example, to the address of the meeter specified from the staff table G3). This enables the meeter to prepare for the arrival of the entering person (for example, to meet the entering person in front of the elevator at the destination floor).
For example, before introducing the elevator 180 to be used and the destination floor to the entering person, the guide unit 151c may output, to the information processing terminal of the meeter, an inquiry requesting an answer as to whether the entering person is permitted to come to the destination floor. This makes flexible guidance according to the wishes of the meeter possible. For example, the guidance may be performed only when the answer to the inquiry indicates permission, or the meeter may change the destination floor so that the changed destination floor is guided.
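This permission-based variation can be pictured as the short sketch below; the terminal interface (ask) and the reply fields are assumptions introduced for illustration.

```python
# Hypothetical sketch of the variation above: ask the meeter for permission (and optionally
# an alternative floor) before guiding the entering person to the destination floor.
def guide_with_permission(meeter_terminal, default_floor: int, robot_say):
    reply = meeter_terminal.ask(
        f"A visitor would like to come to floor {default_floor}. Do you permit it? "
        "You may also answer with a different floor."
    )
    if reply.get("permit"):
        floor = reply.get("floor", default_floor)      # the meeter may change the destination floor
        robot_say(f"Please go to floor {floor}.")
    else:
        robot_say("The person you are visiting is not available at the moment.")
```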

Claims (5)

1. A guidance system, characterized by comprising:
a first entrance person determination unit that performs a first entrance person determination of determining whether input information of an entering person detected by a device other than a physical or virtual robot in a building matches information of a worker who works in the building;
an attendance management unit that changes, among the attendance states of the workers in the building, the attendance state of an entering person whose input information is determined in the first entrance person determination to match the information of a worker, to attendance;
a second entrance person determination unit that performs a second entrance person determination of determining whether input information of an entering person detected by the robot matches the information of a worker;
a conversation control unit that carries out a conversation in which the robot presents one or more questions to a target person, i.e., an entering person whose input information is determined in the second entrance person determination not to match the information of a worker, and receives from the target person one or more answers including the name of the entering person;
a meeter specifying unit that specifies the meeter of the target person based on the contents of the conversation;
an attendance determination unit that performs an attendance determination of determining whether the attendance state of the specified meeter is attendance; and
a guide unit that, when the result of the attendance determination is that the attendance state of the meeter is attendance, introduces to the target person, from the robot, an elevator to be used, i.e., an elevator capable of transporting the target person to a destination floor that is the floor of the meeter, and the destination floor.
2. The guidance system according to claim 1, wherein
when introducing the elevator to be used and the destination floor to the target person, the guide unit outputs, to an information processing terminal of the meeter, a message notifying that the target person is heading for the destination floor.
3. The guidance system according to claim 1, wherein
before introducing the elevator to be used and the destination floor to the target person, the guide unit outputs, to an information processing terminal of the meeter, an inquiry requesting an answer as to whether the target person is permitted to come to the destination floor.
4. The guidance system according to claim 1, wherein
the attendance management unit manages, for a worker who works at two or more buildings, an attendance state indicating the building at which the worker is in attendance, and
when the attendance state of the meeter indicates attendance at a building different from the building that the target person has entered, the guide unit introduces the different building to the target person.
5. The guidance system according to any one of claims 1 to 4, wherein
the device other than the robot is a monitoring camera installed in the building, and the input information of the entering person detected by the device is a face image of the entering person captured by the monitoring camera or information based on the face image.
CN201811354581.3A 2017-12-11 2018-11-14 Guidance system Active CN109896365B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017237269A JP6991051B2 (en) 2017-12-11 2017-12-11 Guidance system
JP2017-237269 2017-12-11

Publications (2)

Publication Number Publication Date
CN109896365A (en) 2019-06-18
CN109896365B (en) 2021-06-08

Family

ID=66943302

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811354581.3A Active CN109896365B (en) 2017-12-11 2018-11-14 Guidance system

Country Status (2)

Country Link
JP (1) JP6991051B2 (en)
CN (1) CN109896365B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111273833B (en) * 2020-03-25 2022-02-01 北京百度网讯科技有限公司 Man-machine interaction control method, device and system and electronic equipment
CN114851191B (en) * 2022-04-25 2024-03-26 北京云迹科技股份有限公司 Distribution robot control method and related equipment
JP7401011B1 (en) 2023-03-24 2023-12-19 フジテック株式会社 elevator control system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003340757A (en) * 2002-05-24 2003-12-02 Mitsubishi Heavy Ind Ltd Robot
JP2008189455A (en) * 2007-02-07 2008-08-21 Mitsubishi Electric Corp Elevator group managing device
CN101618542A (en) * 2009-07-24 2010-01-06 塔米智能科技(北京)有限公司 System and method for welcoming guest by intelligent robot
CN106429661A (en) * 2016-11-30 2017-02-22 上海贝思特电气有限公司 Fingerprint recognizing method and fingerprint recognizing elevator system
CN106829662A (en) * 2017-04-10 2017-06-13 安徽北菱电梯股份有限公司 A kind of multifunctional intellectual elevator device and control method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003030394A (en) * 2001-07-17 2003-01-31 Tokyu Biru Maintenance Kk Attending/leaving management system
JP4072033B2 (en) * 2002-09-24 2008-04-02 本田技研工業株式会社 Reception guidance robot device
JP2004312513A (en) * 2003-04-09 2004-11-04 Casio Comput Co Ltd Entrance management system and program
JP2007160440A (en) * 2005-12-12 2007-06-28 Honda Motor Co Ltd Control device of legged mobile robot
JP2010191565A (en) * 2009-02-17 2010-09-02 Brother Ind Ltd Device, method, and program for receiving visitor
JP5836670B2 (en) * 2011-07-07 2015-12-24 株式会社レイトロン Guidance system, photographing apparatus, and control apparatus
JP6820664B2 (en) * 2016-03-25 2021-01-27 本田技研工業株式会社 Reception system and reception method

Also Published As

Publication number Publication date
JP2019105951A (en) 2019-06-27
JP6991051B2 (en) 2022-01-12
CN109896365A (en) 2019-06-18

Similar Documents

Publication Publication Date Title
CN109896365B (en) Guidance system
EP3011521B1 (en) Mobile application based dispatching
US10358319B2 (en) Allocation of elevators in elevator systems based on internal database
CN109132742B (en) Elevator user guidance system
US8662256B2 (en) Elevator control apparatus with car stop destination floor registration device
CN109693980B (en) Elevator dispatching method, device and system
US7319966B2 (en) Method of communicating information for elevator users
US10259681B2 (en) Elevator dispatch using fingerprint recognition
JP2019031393A (en) Elevator control system, elevator control device and elevator control method
KR20180039147A (en) A mobile method and system for selectively allowing an elevator to be automatically treated to a destination identified by a tenant and identified by a tenant
JP6465829B2 (en) Elevator equipment
WO2021191981A1 (en) Elevator system
CN112839890A (en) Interface device, elevator system and method for controlling the display of a plurality of destination calls
JP6651601B1 (en) Elevator group control device
JP4475565B2 (en) Elevator group management system
WO2021192155A1 (en) Guidance device, guidance system, guidance method, and non-temporary computer-readable medium storing program
JP6896666B2 (en) Work management system and method
JP7177491B2 (en) Elevator system and elevator call registration method
CN116457296A (en) Method and apparatus for an elevator system
JP2012006711A (en) Group control system for elevator
JP7446858B2 (en) Response control device, response control method, and program
JP2023176655A (en) Elevator management apparatus and elevator management method
JP2019178006A (en) Elevator control device
JP7375847B2 (en) elevator control system
JP2023014522A (en) Information display system and information display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant