WO2019087251A1 - Elevator usage log output system, and elevator usage log output method - Google Patents

Elevator usage log output system, and elevator usage log output method

Info

Publication number
WO2019087251A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
elevator
log output
information
boarding
Prior art date
Application number
PCT/JP2017/039131
Other languages
French (fr)
Japanese (ja)
Inventor
貴大 羽鳥
正康 藤原
章 小町
孝道 星野
訓 鳥谷部
加藤 学
藤野 篤哉
渉 鳥海
Original Assignee
Hitachi, Ltd. (株式会社日立製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd. (株式会社日立製作所)
Priority to PCT/JP2017/039131 priority Critical patent/WO2019087251A1/en
Priority to CN201780095835.1A priority patent/CN111212802B/en
Priority to JP2019550004A priority patent/JP7005648B2/en
Publication of WO2019087251A1 publication Critical patent/WO2019087251A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B11/00: Main component parts of lifts in, or associated with, buildings or other structures
    • B66B11/02: Cages, i.e. cars
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B3/00: Applications of devices for indicating or signalling operating conditions of elevators
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B5/00: Applications of checking, fault-correcting, or safety devices in elevators

Definitions

  • The present invention relates to an elevator usage log output system and an elevator usage log output method for accurately grasping the movement information of users of an elevator.
  • In the elevator system of Patent Document 1, a detector that detects drawn-in events around the landing door is provided, together with a history recording unit that records the events detected by the detector. A detection counting unit counts the events recorded by the history recording unit, and the opening and closing of the landing door are controlled based on the number of events detected in a predetermined period. This makes it possible to prevent passengers from being caught in the landing door or drawn into the door pocket.
  • In the elevator system of Patent Document 2, a surveillance camera is provided in the car, and camera images are transmitted to an elevator maintenance company at predetermined times to confirm users. This makes it easy to confirm, from a remote location, the last person leaving the building or persons entering at night, improving the security function of the building.
  • In the elevator system of Patent Document 3, a video recording means records video of the car interior captured by a surveillance camera. When a signal indicating an event with a high possibility of crime is received, the video of the corresponding time period recorded by the video recording means is stored together with the reception time data and identification data representing the event. This reduces the recording capacity needed for in-car video and allows the video of an event to be retrieved quickly.
  • An object of the present invention is to provide an elevator usage log output system and an elevator usage log output method capable of accurately grasping the movement information of users.
  • The feature of the present invention is that discrimination information and ID information for identifying each user are set and assigned from the image of a hall camera that photographs the elevator hall or an in-car camera that photographs the inside of the car; the movement of each user is then traced based on this discrimination information and ID information, and at least the user's boarding floor and alighting floor are linked and stored, so that the user's movement information is grasped.
  • FIG. 1 is a block diagram outlining the elevator operation management system and surveillance camera control system according to a first embodiment of the present invention. FIG. 2 is an external view of the landing seen diagonally from above. FIG. 3 is an explanatory view showing the user detection ranges set for each unit. FIG. 4 is an explanatory view explaining the movement of users riding a unit from the fourth floor to the first floor. FIGS. 5A to 5F are explanatory views illustrating user movement: the fifth-floor landing before and after users board a unit, the seventh-floor landing where one user has alighted, the eighth-floor landing where two users have alighted, the ninth-floor landing where one user has alighted, and the ninth-floor landing before one new user boards. FIG. 6 is a flowchart explaining the method of acquiring user movement information. FIGS. 7 to 9 are detailed control flows of steps S12, S13, and S20 of FIG. 6. FIGS. 10 and 11 show specific data of the log output executed in step S19.
  • FIG. 1 is a block diagram showing an outline of a group management controlled elevator according to a first embodiment of the present invention.
  • The elevator operation management system 10, functioning as a group management control device, is connected to elevator unit control systems 11A to 11N that control the individual units of a plurality of elevators. The elevator unit control systems 11A to 11N receive control commands from the elevator operation management system 10 and actually operate the units; for example, they control the electric motor of each unit's hoisting machine, the brake mechanism of the hoisting machine, the door opening and closing motor, and the like.
  • The elevator operation management system 10 is also connected, via the communication network 12, to the hall elevator service request device 13, the building management system 14, the public institution management system 15, and the surveillance camera control system 16.
  • The hall elevator service request device 13, the building management system 14, and the public institution management system 15 are not related to the present embodiment, so their detailed description is omitted.
  • The system configuration shown in FIG. 1 is operated as a private system built for each building or for each group of buildings.
  • The elevator operation management system 10 comprises a learning unit 10A, a receiving unit 10B, a per-floor headcount evaluation unit 10C, an overall evaluation unit 10D, and an allocation command unit 10E, each of which can be realized as a control function of a computer.
  • The receiving unit 10B is connected to the communication network 12 and receives various related information from the surveillance camera control system 16.
  • The received related information is sent to the learning unit 10A, where it is learned through rewriting processing and the like.
  • The received related information is also sent to the per-floor headcount evaluation unit 10C, where a predetermined evaluation calculation is executed.
  • The evaluation result is sent to the overall evaluation unit 10D, where an overall evaluation calculation is executed together with other evaluation parameters.
  • The unit allocation information calculated by the overall evaluation unit 10D is sent to the allocation command unit 10E, which transmits control commands to the corresponding elevator unit control systems 11A to 11N and has them execute the predetermined functions.
  • The surveillance camera control system 16, which is the feature of the present embodiment, comprises an image input processing unit 16A for the surveillance cameras, a detection range setting processing unit 16B, a user detection processing unit 16C, an ID setting processing unit 16D, a per-floor user detection processing unit 16E, a movement information output processing unit 16F, and a log output processing unit 16G, all of which can likewise be realized as control functions of a computer.
  • The detection range setting processing unit 16B sets the detection range within which the user detection processing unit 16C described below detects users, and this range can be set arbitrarily; for example, a semicircular detection range of a predetermined radius, or a rectangular one, can be set in front of the landing door of each unit.
  • The user detection processing unit 16C identifies and extracts users from the images captured by the surveillance cameras.
  • It also stores discrimination information for each detected user, for example the user's image feature amount, which makes it possible to identify the same user after he or she has moved to another floor.
  • In the following, the image feature amount used to identify a user is described as the discrimination information; however, any information other than an image feature amount that identifies a user is likewise handled as discrimination information.
  • The ID setting processing unit 16D sets and assigns ID information to every user detected by the user detection processing unit 16C. For example, if the discrimination information of a user's face or body is linked to ID information, the ID information can be traced through image analysis of the user. The discrimination information and ID information of individual users can also be linked by other methods, and it is desirable to encrypt them to ensure confidentiality.
  • The conditions for releasing the link between a user and an ID can also be set. These release conditions may be values fixed when the system is delivered, or may change according to communication requests from the various cooperating systems. Because linking a user to an ID requires storing the user's feature points in a computer, confidentiality needs particular consideration in the usage scenes of this system.
  • The per-floor user detection processing unit 16E detects the movement information of each user on each floor based on the user's discrimination information and ID information.
  • The movement information detected here includes, for example, based on the user's discrimination information and ID information: the generation time of the ID information and the floor where it was generated, the unit number boarded, the user's boarding time and boarding floor, and the user's alighting time and alighting floor.
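The items above map naturally onto a small record type. The sketch below is one possible shape; the field names are illustrative assumptions, and only the listed items come from the text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MovementRecord:
    """One user's movement information, per the items named in the text.
    Field names are illustrative, not from the patent."""
    id_number: str                        # assigned ID information, e.g. "0001"
    occurrence_time: str                  # time the ID information was generated
    occurrence_floor: int                 # floor where the ID was generated
    unit: Optional[str] = None            # unit number boarded (assumed label)
    boarding_time: Optional[str] = None
    boarding_floor: Optional[int] = None
    alighting_time: Optional[str] = None
    alighting_floor: Optional[int] = None

# Example: the user Pa from the FIG. 10 description later in this document
# (the unit label "4B" is an assumption for illustration).
pa = MovementRecord("0001", "8:00:01", 5, unit="4B",
                    boarding_time="8:01:05", boarding_floor=5,
                    alighting_time="8:01:20", alighting_floor=8)
```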
  • The movement information output processing unit 16F stores the per-floor user movement information obtained by the per-floor user detection processing unit 16E in a separately provided rewritable storage area so that it can be output as a log.
  • The log output processing unit 16G outputs the movement information stored by the movement information output processing unit 16F as a log. The user movement information output in this way can be used effectively as parameter data for advance simulations for group management control.
  • FIGS. 2 to 5 are views explaining the concept of the present embodiment, and FIGS. 6 to 11 are views explaining a specific embodiment.
  • FIG. 2 shows the landing of a certain floor (for example, the fourth floor), and a landing surveillance camera 17 is installed at an arbitrary position of the landing.
  • The landing surveillance camera 17 is a wide-angle camera capable of photographing the entire landing.
  • Four elevators are installed at this landing, operated as units 4A to 4D.
  • A landing surveillance camera 17 is likewise installed on every floor other than this one.
  • The landing surveillance cameras 17 on the respective floors are configured as network cameras and are comprehensively managed and controlled by the surveillance camera control system 16, which makes it possible to trace users.
  • Users stand by in front of unit 4B, which serves the downward direction, and unit 4D, which serves the upward direction: four users Pdn heading downward wait in front of unit 4B, and six users Pup heading upward wait in front of unit 4D.
  • FIG. 3 shows the user detection ranges within which image analysis is performed to detect users for each unit.
  • The user detection ranges 19A to 19D are set as semicircles of a predetermined radius centered on the landing doors 18A to 18D, so that one detection range is set for each of the units 4A to 4D.
  • The user detection range can be set arbitrarily by the image input processing unit 16A and the detection range setting processing unit 16B; besides the semicircular shape, a rectangular or any other shape is also possible.
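A minimal sketch of such a semicircular range test, assuming a top-down floor-plan coordinate frame (the geometry and all constants are illustrative, not from the patent):

```python
import math

def in_semicircular_range(user_xy, door_center_xy, radius, into_hall=(0.0, 1.0)):
    """Test whether a user stands inside a semicircular detection range of
    the given radius, centered on a landing door and opening into the hall.
    `into_hall` is the assumed unit normal pointing away from the door wall."""
    dx = user_xy[0] - door_center_xy[0]
    dy = user_xy[1] - door_center_xy[1]
    if math.hypot(dx, dy) > radius:
        return False
    # Keep only the half-plane on the hall side of the door wall.
    return dx * into_hall[0] + dy * into_hall[1] >= 0.0

# Four doors 18A-18D along one wall; one range 19A-19D per unit (radius 2.5 m).
doors = {"4A": (2.0, 0.0), "4B": (6.0, 0.0), "4C": (10.0, 0.0), "4D": (14.0, 0.0)}
ranges = {unit: (center, 2.5) for unit, center in doors.items()}
print([u for u, (c, r) in ranges.items() if in_semicircular_range((6.5, 1.2), c, r)])
# -> ['4B']
```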
  • When a user is detected, image feature points serving as the user's discrimination information are extracted by image analysis, and an ID number linked to those feature points is set and assigned. Therefore, if a user photographed by another landing surveillance camera 17 has matching image feature points, the user's movement trajectory can be estimated from the ID number.
  • By extracting feature points such as the user's face and shoulders through image analysis, the user detection range and the unit being waited for can be determined from the user's posture and facing direction.
  • Alternatively, the user's movement trajectory can be estimated from the temporal change of the image, and the user detection range and unit can be determined by judging toward which unit the user has moved.
  • The elevator position coordinates are set in advance, or the elevator doors are detected from the landing image and the position coordinates are set automatically; by combining these position coordinates with the detected posture and facing direction of the user as described above, it is possible to determine which elevator position the user is facing and hence which unit the user is waiting for.
  • For example, the user Pdn-1 is detected in the user detection range 19D as a user who will board unit 4D.
  • Even in an area where user detection ranges overlap, the applicable detection range can be determined by the same processing.
  • A user present in the overlapping area is likewise set and assigned an ID number linked to the image feature points serving as his or her discrimination information.
  • The boarding time information can be acquired as the time when the user boarded the corresponding unit.
  • FIG. 4 shows the movement of the users who board unit 4B shown in FIG. 3 and travel downward.
  • Hall surveillance cameras 17-4 to 17-1 are installed on each floor, and the image information of each hall surveillance camera 17 is sent to the surveillance camera control system 16 to be comprehensively managed and controlled.
  • "n" of reference numeral 17-n of the surveillance camera means a floor number.
  • An in-car surveillance camera 20 is installed in the car 21 of unit 4B, and its image information is also sent to the surveillance camera control system 16 for comprehensive management and control.
  • FIGS. 5A to 5F show the case where the car 21 moves upward, contrary to FIG. 4.
  • As shown in FIG. 5A, the users waiting on the fifth floor wait for the arrival of unit 4B.
  • When the users board, the in-car surveillance camera 20 of unit 4B sends images of the users Pa to Pd to the surveillance camera control system 16. Discrimination information of the users Pa to Pd is obtained from these images and compared with the discrimination information of the users sent from the landing surveillance camera 17-5. If the comparison matches, it is recognized that all the users Pa to Pd at the hall have boarded unit 4B; if some users do not match, it is recognized that those users moved by the stairs or other means.
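A minimal sketch of this hall-versus-car comparison, with toy one-dimensional "features" standing in for real image feature amounts (everything here is an illustrative assumption):

```python
def split_boarded(hall_features, car_features, similar):
    """Users whose hall-side discrimination information reappears in the car
    image are treated as having boarded; the rest are assumed to have moved
    by stairs or other means. `similar` is any feature-comparison predicate."""
    boarded, not_boarded = [], []
    for name, hall_vec in hall_features.items():
        if any(similar(hall_vec, car_vec) for car_vec in car_features):
            boarded.append(name)
        else:
            not_boarded.append(name)
    return boarded, not_boarded

hall = {"Pa": [1.0], "Pb": [2.0], "Pc": [3.0], "Pd": [9.0]}
car = [[1.02], [1.98], [3.01]]
print(split_boarded(hall, car, lambda a, b: abs(a[0] - b[0]) < 0.1))
# -> (['Pa', 'Pb', 'Pc'], ['Pd'])
```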
  • On the seventh floor, the user Pd alights, and the landing surveillance camera 17-7 sends the image of the user Pd to the surveillance camera control system 16.
  • Discrimination information of the user Pd is obtained from this image and compared with the discrimination information of the user Pd from the image sent by the landing surveillance camera 17-5. If they match, it is recognized that the user Pd alighted on the seventh floor, and the alighting floor, alighting time, and so on are stored.
  • The accuracy of user recognition can be improved by also comparing the discrimination information from the landing surveillance camera 17-7 with the discrimination information from the in-car surveillance camera 20.
  • Recognition accuracy can be improved further by executing the same processing on the following floors.
  • On the ninth floor, the user Pc alights as shown in FIG. 5E, and the landing surveillance camera 17-9 sends the image of the user Pc to the surveillance camera control system 16.
  • Discrimination information of the user Pc is obtained from this image and compared with the discrimination information of the user Pc from the image sent by the landing surveillance camera 17-5. If they match, it is recognized that the user Pc alighted on the ninth floor, and the alighting floor, alighting time, and so on are stored. In addition, since the user Pc also disappears from the image of the in-car surveillance camera 20 of unit 4B, it can be confirmed that the user Pc alighted on the ninth floor.
  • When a new user Pe heading downward appears at the ninth-floor landing as shown in FIG. 5F, the user Pe is present in the user detection range 19C, so the discrimination information of the user Pe is extracted.
  • ID information is set and assigned, and at the same time the set time of the ID number is measured and stored. Thereafter, the movement of the user Pe is traced to acquire movement information.
  • FIG. 6 is a flowchart showing a computer-based control flow of the above-described processing performed by the surveillance camera control system 16. This control flow is started at each predetermined start timing.
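For orientation, the flow can be summarized as the Python skeleton below. Every helper name is a placeholder for one of the steps described next and is assumed to be implemented elsewhere; this sketches the shape of the flow, not a real API.

```python
def fig6_control_flow(camera_images):
    # Each call stands for one step of FIG. 6 as described in the text.
    ranges = set_user_detection_ranges(camera_images)      # S10
    users = detect_users(camera_images, ranges)            # S11
    users += resolve_overlap_area(camera_images, ranges)   # S12 (FIG. 7)
    assign_ids(users)                                      # S13 (FIG. 8)
    store_occurrence_times(users)                          # S14
    boarded = detect_boarding(users, ranges)               # S15
    store_boarding_times(boarded)                          # S16
    alighted = detect_alighting(boarded)                   # S17
    store_alighting_times(alighted)                        # S18
    write_movement_log(alighted)                           # S19
    release_ids_if_due(users)                              # S20 (FIG. 9)
```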
  • In step S10, user detection range setting processing is performed: user detection ranges 19A to 19D as shown in FIG. 3 are set based on the input landing image. Users are then extracted for each unit from the images within these ranges in the following step S11. When the detection range setting is completed, the process proceeds to step S11.
  • In step S11, user detection processing is executed: the images within the user detection ranges 19A to 19D set in step S10 are analyzed, and individual users are extracted for each unit. In this extraction, an image feature amount is obtained from the image captured by the landing surveillance camera 17, and each individual user is identified and extracted from this feature amount. Since the feature amount is used as the user's discrimination information, it is stored in a storage area (not shown).
  • When the images within the user detection ranges 19A to 19D are analyzed, it sometimes cannot be judged to which detection range a user belongs if the user is in an area where the detection ranges overlap. Therefore, when the user detection processing ends, the process proceeds to step S12, and setting processing for users present in the overlapping area is performed.
  • In step S12, the user detection range of a user present in the overlapping area is reset; here, processing called human vector detection is performed.
  • This processing determines and resets the user detection range to which a user Pdn-1 (see FIG. 3) present in the overlapping area belongs: by extracting feature points such as the user's face and shoulders through image analysis, the user detection range and the unit being waited for can be determined from the user's facing direction and posture.
  • Alternatively, the movement trajectory of the user Pdn-1 can be estimated from the temporal change of the image, and the user detection range and unit can be determined by judging toward which unit the user has moved.
  • In other words, the user detection range of a user present in the overlapping area can be set based on at least one piece of information among the user's posture and direction, or the movement trajectory.
  • The human vector detection processing will be described in detail with reference to FIG. 7. When the human vector detection processing is completed, the process proceeds to step S13.
  • In step S13, ID numbers are set and assigned to the individual discrimination information of all the users extracted in steps S11 and S12. By linking discrimination information, including feature amounts of the user's face and body, with ID information, the ID information of a moving user can be traced through image analysis. In step S13, the ID number linked to each individual user's discrimination information is stored.
  • an ID number of "0001" is set to the user Pa as shown in FIG. 4, an ID number of "0002” is set to the user Pb, and an ID number of "0003” to the user Pc.
  • the user Pd can be set and the ID number "0004" can be set. Therefore, for example, when the user Pa moves from a certain departure floor to a different arrival floor, the discrimination information of the user Pa is acquired in the departure floor, so the image of the user Pa in the arrival floor If the same discrimination information is obtained in the analysis, the user Pa can be traced by the ID number "0001".
  • The ID assignment processing will be described in detail with reference to FIG. 8.
  • The discrimination information and ID information of individual users can also be linked by other methods, and it is desirable to encrypt them to ensure confidentiality.
  • When the ID assignment is completed, the process proceeds to step S14.
  • In step S14, when an individual user arrives in a landing user detection range and an ID number is set, the assignment time (occurrence time) is stored. Since the occurrence time differs for each ID number, it is possible, for example, to acquire the distribution of user departure times for each floor.
  • In step S15, it is detected whether the user has boarded the unit that arrived.
  • This can be detected based on the disappearance of the user from the image: when the user disappears from the user detection range, it is determined that the user has boarded the unit.
  • Since the user's discrimination information and ID information have already been acquired, each individual user's boarding information can be obtained.
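A minimal sketch of this disappearance-based boarding judgment; the per-unit sets of ID numbers, and the door-open signal used to avoid counting users who merely walked away, are illustrative assumptions:

```python
def detect_boarding(prev_in_range, now_in_range, door_open):
    """Step S15 sketch: an ID that was inside a unit's detection range and
    has disappeared from it while that unit's door was open is judged to
    have boarded. Inputs are dicts of {unit: set of ID numbers}."""
    boarded = {}
    for unit, before in prev_in_range.items():
        gone = before - now_in_range.get(unit, set())
        if gone and door_open.get(unit, False):
            boarded[unit] = gone
    return boarded

prev = {"4B": {"0001", "0002"}, "4D": {"0005"}}
now = {"4B": {"0002"}, "4D": {"0005"}}
print(detect_boarding(prev, now, {"4B": True, "4D": True}))
# -> {'4B': {'0001'}}
```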
  • Instead of judging from the image of the landing surveillance camera 17, the user's boarding information can also be obtained from the image of the in-car surveillance camera 20 provided in the car 21.
  • By comparing the discrimination information of the users waiting at the landing with the discrimination information obtained from the image of the in-car surveillance camera 20, it can be determined whether a user has boarded.
  • When the boarding detection is completed, the process proceeds to step S16.
  • In step S16, the time when the user boarded the unit (boarding time) is stored. Here too, since the user's discrimination information and ID number have been acquired, the boarding time can be obtained from the boarding information detected for each user in step S15. When the boarding time is acquired, the process proceeds to step S17.
  • In step S17, when the unit the user boarded reaches the desired alighting floor, the user's alighting information is detected.
  • The user's alighting can be detected by the landing surveillance camera 17 on the arrival floor: when the user alights on the arrival floor, the landing surveillance camera 17 there sends the image of the user to the surveillance camera control system 16.
  • The surveillance camera control system 16 obtains the discrimination information of the alighted user from the received image and compares it with the discrimination information of the user from the image sent by the landing surveillance camera 17 on the departure floor. If they match, it is recognized that the user who boarded on the departure floor has alighted, and the ID number, alighting floor, and so on are stored. In addition, since the user also disappears from the image of the in-car surveillance camera 20, the alighting can be confirmed.
  • When the alighting detection processing is completed, the process proceeds to step S18.
  • In step S18, the time when the user alighted from the unit (alighting time) is stored. Here too, since the user's discrimination information and ID number have been acquired, the alighting time can be obtained from the alighting information detected for each user in step S17. When the alighting time is acquired, the process proceeds to step S19.
  • In step S19, the movement information (movement history) of each individual user acquired by executing the control steps described above is stored in a rewritable log storage area so that it can be output as a log.
  • The stored items are the user's discrimination information, the "ID number" linked to it, the "occurrence time (set time)" of the ID number and the "occurrence floor" where it was generated, the "boarding time" and "boarding floor" of the unit boarded, and the "alighting time" and "alighting floor" of the unit alighted from.
  • Information on the user detection range and the unit is also available and can be stored in the log storage area as needed.
  • In step S20, the ID number of each individual user and the discrimination information linked to it are released (erased), leaving only the substantive movement information of the user as a history.
  • When step S20 is completed, the control flow of FIG. 6 ends, and the process waits for the next start timing.
  • Next, a specific example of step S12 shown in FIG. 6 will be described with reference to FIG. 7.
  • Steps S30 to S34 shown in FIG. 7 are executed, and the process then returns to step S13 of FIG. 6.
  • In step S30, it is determined from the camera image whether a new user has been detected at the landing. Since image analysis enables identification of individual users, a user who newly appears at the landing relative to those present before the current time can be identified by extraction. If no new user is extracted, the process goes to the end; if a new user is extracted, the process proceeds to step S31.
  • In step S31, it is determined whether the new user Pdn-1 is present in the overlapping area of the user detection range 19D and the user detection range 19C, as shown in FIG. 3. If it is determined that the new user Pdn-1 is in the overlapping area, the process proceeds to step S32; if not, the process proceeds to step S33.
  • In step S32, the user detection range to which the new user belongs is determined by the human vector processing (step S12) shown in FIG. 6, and the unit is determined accordingly. The detected user is digitized as position coordinates and as an angle of the user's facing direction measured from an arbitrary reference direction. Taking the user Pdn-1 in FIG. 3 as an example, in the layout viewed from above the landing plane, the direction perpendicular to the upper side of the figure is taken as 0 degrees, and from there it is detected which elevator the user is facing.
  • The user detection range is reset by setting a waiting direction for each unit, detecting the waiting direction of the user in the overlapping area from the surveillance camera image, and comparing it with the waiting direction of each unit.
  • In this way, the user detection range of a user present in the overlapping area can be set.
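A minimal sketch of such an angle comparison, assuming a top-down coordinate frame with the door wall along the x-axis (the geometry and all names are illustrative only):

```python
import math

def assign_unit_by_facing(user_xy, facing_deg, door_centers):
    """Step S32 sketch: compare the user's facing direction (0 degrees =
    facing the door wall head-on, as in the text) with the bearing from the
    user to each door, and pick the unit whose bearing deviates least."""
    best_unit, best_diff = None, None
    for unit, (door_x, door_y) in door_centers.items():
        dx, dy = door_x - user_xy[0], door_y - user_xy[1]
        bearing = math.degrees(math.atan2(dx, -dy))  # 0 deg = straight at wall
        diff = abs((facing_deg - bearing + 180) % 360 - 180)
        if best_diff is None or diff < best_diff:
            best_unit, best_diff = unit, diff
    return best_unit

doors = {"4C": (10.0, 0.0), "4D": (14.0, 0.0)}
# A user in the 19C/19D overlap at (12.0, 2.0), facing 45 degrees to the right.
print(assign_unit_by_facing((12.0, 2.0), 45.0, doors))  # -> '4D'
```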
  • In step S33, since it was determined in step S31 that the new user clearly belongs to a single user detection range, the unit is determined from that detection range.
  • In step S34, units are assigned both to users in the overlapping area and to users extracted within a user detection range; even when a new user appears outside the overlapping area but within a detection range, the unit corresponding to that detection range is assigned.
  • When this is completed, the process proceeds to step S13 of FIG. 6, and the operation of the control flow shown in FIG. 6 is continued.
  • Next, a specific example of step S13 shown in FIG. 6 will be described with reference to FIG. 8.
  • Steps S40 and S41 shown in FIG. 8 are executed, and the process then returns to step S14.
  • In step S40, it is determined whether the discrimination information of a user present in the user detection range has been detected for the first time.
  • Since the discrimination information of the users present in the user detection range before the current time is already stored in the storage area of the surveillance camera control system 16, comparing the stored discrimination information with the newly detected discrimination information determines whether the user has been detected for the first time.
  • If the discrimination information is not detected for the first time, the process goes to the end; if it is detected for the first time, the process proceeds to step S41.
  • In step S41, a new ID number is assigned to the user discrimination information detected for the first time. Since the user's discrimination information and ID number are linked and stored, the movement of the user given the ID number can be traced. When this processing is completed, the process proceeds to step S14 of FIG. 6, and the operation of the control flow shown in FIG. 6 is continued.
  • Next, a specific example of step S20 shown in FIG. 6 will be described with reference to FIG. 9.
  • Steps S50 to S52 shown in FIG. 9 are executed, and the process then goes to the end.
  • In step S50, it is determined whether release processing of the ID number is necessary.
  • A case where the user's movement information (movement history) is traced over a predetermined time and a case where it is not are distinguished; as an example, the period is set to one day. If the release condition is set on a per-day basis, the process proceeds to step S51; if not, the process proceeds to step S52.
  • In step S51, the discrimination information (feature amount) and the ID number are not unlinked until the date and time reach 0:00:00; they are kept linked and held. This makes it possible to trace the user's movement throughout the day.
  • In step S52, when the predetermined release condition is satisfied in less than one day, the ID number assigned to the individual user and the discrimination information linked to it are released (erased). What is released is only the ID number assigned to the individual user and the discrimination information linked to it; the other items, namely the "occurrence time (set time)" of the ID number, the "occurrence floor" where the ID number was generated, the "boarding time" and "boarding floor" of the unit boarded, and the "alighting time" and "alighting floor" of the unit alighted from, are retained as simulation parameter data.
  • Various release conditions can be set in step S52: (1) a release condition that the user has alighted from the unit; (2) a release condition based on the expiration of a time such as the round-trip time used in the traffic calculation of an elevator installation plan; and (3), assuming there are a first elevator group and a second elevator group connected by transfers, a release condition tied to the transfer from the first group to the second group or from the second group to the first group (for example, when there is a transfer from a lower zone to an upper zone, release when the user alights from the upper-zone elevator). An appropriate release condition can thus be set.
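As an illustration only, the example conditions above might be evaluated as follows; the record keys, the policy names, and the default round-trip time are assumptions rather than anything fixed by the patent:

```python
import datetime

def id_release_due(record, policy, now):
    """Return True when the ID linked to `record` should be released.
    `record` is a dict holding the stored movement items; `policy` selects
    one of the example conditions (1)-(3) or the per-day rule."""
    if policy == "on_alight":                    # condition (1)
        return record.get("alighting_time") is not None
    if policy == "time_expiry":                  # condition (2), e.g. round-trip time
        elapsed = now - record["occurrence_datetime"]
        return elapsed >= record.get("round_trip", datetime.timedelta(minutes=5))
    if policy == "transfer":                     # condition (3): release after the
        return record.get("alighted_group") == 2 # second elevator group is left
    if policy == "per_day":                      # step S51: hold until the date turns
        return now.date() != record["occurrence_datetime"].date()
    raise ValueError(f"unknown policy: {policy}")

rec = {"occurrence_datetime": datetime.datetime(2017, 10, 31, 8, 0, 1),
       "alighting_time": "8:01:20"}
print(id_release_due(rec, "on_alight", datetime.datetime(2017, 10, 31, 8, 2)))  # True
```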
  • FIG. 10 shows a specific example of the log tabulation processing executed in step S19, namely the log output when alighting from the unit is used as the ID release condition.
  • The log output stores the "user discrimination information" representing the user, the "ID number" linked to it, the "occurrence time" of the ID number and the "occurrence floor" where it was generated, the "boarding time" and "boarding floor" of the unit boarded, and the "alighting time" and "alighting floor" of the unit alighted from.
  • the ID number is set to "0001”, and this ID number is set to "5:00” and "8:00:01”, and the user Pa is "5th floor”. It is possible to obtain information that the passenger boarding board “8:01:05” at the boarding point and that the user Pa has got off the boarding board at “eighth floor” from “boarding board” to “8:01:20”. The same applies to other users, and it is possible to improve the accuracy of the prior simulation for performing group management control from the movement information of these users.
  • At this point, the user's discrimination information and the "ID number" linked to it are deleted and replaced with generic user labels (for example, A, B, C, and so on). Since the simulation does not require the user's personal information, there is no problem in deleting personal information such as the user's discrimination information, and doing so is also desirable from the viewpoint of personal information protection.
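A minimal sketch of that replacement, with assumed record keys:

```python
def anonymize_log(records):
    """Drop discrimination information and ID numbers from the movement log
    and substitute generic user labels A, B, C, ... (cf. FIG. 10)."""
    out = []
    for i, rec in enumerate(records):
        label = chr(ord("A") + i % 26)           # generic user label
        kept = {k: v for k, v in rec.items()
                if k not in ("id_number", "discrimination_info")}
        kept["user"] = label
        out.append(kept)
    return out

log = [{"id_number": "0001", "discrimination_info": b"\x12\x34",
        "boarding_floor": 5, "boarding_time": "8:01:05",
        "alighting_floor": 8, "alighting_time": "8:01:20"}]
print(anonymize_log(log))
# -> [{'boarding_floor': 5, 'boarding_time': '8:01:05',
#      'alighting_floor': 8, 'alighting_time': '8:01:20', 'user': 'A'}]
```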
  • FIG. 11 shows another specific example of the log tabulation processing executed in step S19, namely the log output when "one day" is used as the ID release condition.
  • The log output is basically the same as in FIG. 10, but since the ID release condition is one day, movement information is acquired on a per-day basis, as shown for the user Pc.
  • the ID number is set to "0003" for the user Pc, and this ID number is set to "8:00:04" on the “5th floor”, and the user Pc is Board the boarding number at 8:01:06 on the 5th floor and further, the user Pc gets off from the boarding platform on the 9th floor at 8:01:55. You can get information that you went to the office.
  • the ID number of the user Pc is held as "0003", and this ID number is "9th floor” and "10:
  • the user Pc is detected again at 10:25 “, and the user Pc takes" 10:10:36 "to the boarding machine at the" 9th floor “platform, and the user Pc also operates the" 1st floor "platform You can get the information that you got off at "10:11:03" from the boarding vehicle.
  • By setting the ID release condition in this way, data that maintains simulation accuracy can be created for any usage scene while taking privacy into consideration.
  • In the log output, the occurrence floor, the boarding time and boarding floor of the unit boarded, and the alighting time and alighting floor of the unit alighted from are stored, but other storage items can also be set.
  • The embodiment described above shows an example in which the surveillance camera control system 16 comprehensively manages and controls the images of the hall surveillance cameras 17 provided at the landings of the floors and the in-car surveillance cameras 20 provided in the cars.
  • However, the user's movement information may be acquired using only the landing surveillance cameras 17 provided at the landings of the floors, or using only the in-car surveillance cameras 20 provided in the cars.
  • When only the landing surveillance cameras 17 are used, the boarding user detection processing of step S15 and the alighting user detection processing of step S17 can be implemented by detecting the user's behavior from the image of the landing surveillance camera 17 on the boarding floor and the image of the landing surveillance camera 17 on the alighting floor. That is, boarding can be detected when the user disappears from the user detection range, and alighting can be detected when the user appears in the user detection range.
  • When only the in-car surveillance cameras 20 are used, the discrimination information of each individual user detected by the in-car surveillance camera 20 is linked to an ID number.
  • The appearance time when the user appears in the car and the disappearance time when the user disappears from the car are stored for each ID number. The departure floor and the arrival floor can then be linked from the unit information and the time information for each ID number, and the log output can be acquired.
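A minimal sketch of linking those times to floors, assuming the unit's floor positions over time are available as a time-sorted log (the data shape is an assumption; such data might come from the unit control systems):

```python
import bisect

def floors_from_car_camera(appear_t, vanish_t, floor_log):
    """Given the times a user appeared in and disappeared from the car,
    look up the unit's floor at those moments. `floor_log` is a list of
    (time, floor) pairs sorted by time."""
    times = [t for t, _ in floor_log]

    def floor_at(t):
        i = bisect.bisect_right(times, t) - 1
        return floor_log[max(i, 0)][1]

    return floor_at(appear_t), floor_at(vanish_t)   # (boarding, alighting) floors

log = [(0, 1), (30, 5), (95, 9)]        # the unit arrives at floor f at time t
print(floors_from_car_camera(35, 100, log))
# -> (5, 9)
```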
  • As described above, in the present embodiment, discrimination information and ID information for identifying each user are set and assigned from the image of the hall camera that photographs the elevator hall or the in-car camera that photographs the inside of the car.
  • The movement of each user is traced based on this discrimination information and ID information, and at least the user's boarding floor and alighting floor are linked and stored, so that the user's movement information is grasped.
  • The present invention is not limited to the embodiments described above, but includes various modifications.
  • The embodiments described above are described in detail in order to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to those having all the configurations described.
  • Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • REFERENCE SIGNS: 10 ... elevator operation management system, 10A ... learning unit, 10B ... receiving unit, 10C ... per-floor headcount evaluation unit, 10D ... overall evaluation unit, 10E ... allocation command unit, 11A-11N ... elevator unit control systems, 12 ... communication network, 16 ... surveillance camera control system, 16A ... image input processing unit, 16B ... detection range setting processing unit, 16C ... user detection processing unit, 16D ... ID setting processing unit, 16E ... per-floor user detection processing unit, 16F ... movement information output processing unit, 16G ... log output processing unit, 17 ... landing surveillance camera, 18A-18D ... landing doors, 19A-19D ... user detection ranges, 20 ... in-car surveillance camera.

Landscapes

  • Engineering & Computer Science (AREA)
  • Civil Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Structural Engineering (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)
  • Elevator Control (AREA)

Abstract

The present invention provides an elevator usage log output system and an elevator usage log output method that can accurately grasp the movement information of users. Discrimination information and ID information for identifying a user are set and assigned to each user Pa to Pd on the basis of an image from a camera 17 that photographs an elevator landing or from a camera 20 that photographs the inside of an elevator car; the movement of the users Pa to Pd is traced on the basis of this discrimination information and ID information, and at least the boarding floor and alighting floor of each user are linked and stored, so that the movement information of users is accurately grasped. As a result, the movement information of a user can be accurately grasped, and therefore, for example, the accuracy of advance simulations for group management control can be improved.

Description

Elevator usage log output system and elevator usage log output method
The present invention relates to an elevator usage log output system and an elevator usage log output method for accurately grasping the movement information of users of an elevator.
In relatively large buildings, multiple elevators are installed side by side to improve passenger transport capacity, and systems have been introduced that select and dispatch the most suitable car when a call is registered at the landing. Furthermore, as buildings grow larger, the number of elevators installed side by side increases, and these elevators are appropriately controlled by a group management control device to improve service, for example by reducing user waiting times.
In elevators using such a group management control device, the behavior of users and the like is measured in order to control each elevator appropriately. For example, JP 2010-254391 A (Patent Document 1), JP 2003-221174 A (Patent Document 2), and JP 2006-21852 A (Patent Document 3) propose the following systems.
In the elevator system of Patent Document 1, a detector that detects drawn-in events around the landing door is provided, together with a history recording unit that records the events detected by the detector. A detection counting unit counts the events recorded by the history recording unit, and the opening and closing of the landing door are controlled based on the number of events detected in a predetermined period. This makes it possible to prevent passengers from being caught in the landing door or drawn into the door pocket.
In the elevator system of Patent Document 2, a surveillance camera is provided in the car, and camera images are transmitted to an elevator maintenance company at predetermined times to confirm users. This makes it easy to confirm, from a remote location, the last person leaving the building or persons entering at night, improving the security function of the building.
In the elevator system of Patent Document 3, a video recording means records video of the car interior captured by a surveillance camera. When a signal indicating an event with a high possibility of crime is received, the video of the corresponding time period recorded by the video recording means is stored together with the reception time data and identification data representing the event. This reduces the recording capacity needed for in-car video and allows the video of an event to be retrieved quickly.
Patent Document 1: JP 2010-254391 A
Patent Document 2: JP 2003-221174 A
Patent Document 3: JP 2006-21852 A
In an elevator equipped with this kind of group management control device, simulated operation by simulation is often required in advance in order to operate the elevators efficiently. For this purpose, it is necessary to accurately grasp the users' movement information (movement upward or downward in the building, the number of users, the unit boarded, the waiting time, the departure floor, the arrival floor, and so on). If the users' movement information can be grasped accurately in this way, the accurate parameter data needed for advance simulations for group management control can be obtained, so the accuracy of the simulation can be improved. Group management control devices are therefore strongly required to grasp users' movement information accurately.
An object of the present invention is to provide an elevator usage log output system and an elevator usage log output method capable of accurately grasping the movement information of users.
The feature of the present invention is that discrimination information and ID information for identifying each user are set and assigned from the image of a hall camera that photographs the elevator hall or an in-car camera that photographs the inside of the car, the movement of each user is traced based on this discrimination information and ID information, and at least the user's boarding floor and alighting floor are linked and stored to grasp the user's movement information.
According to the present invention, the movement information of users can be grasped accurately; for example, the accurate parameter data needed for advance simulations for group management control can be obtained, so the accuracy of the simulation can be improved.
FIG. 1 is a block diagram outlining the elevator operation management system and surveillance camera control system according to a first embodiment of the present invention. FIG. 2 is an external view of the landing seen diagonally from above. FIG. 3 is an explanatory view showing the user detection ranges set for each unit. FIG. 4 is an explanatory view explaining the movement of users riding a unit from the fourth floor to the first floor. FIG. 5A is an explanatory view of user movement showing the fifth-floor landing before users board a unit. FIG. 5B is an explanatory view of user movement showing the fifth-floor landing after users have boarded a unit. FIG. 5C is an explanatory view of user movement showing the seventh-floor landing where one user has alighted. FIG. 5D is an explanatory view of user movement showing the eighth-floor landing where two users have alighted. FIG. 5E is an explanatory view of user movement showing the ninth-floor landing where one user has alighted. FIG. 5F is an explanatory view showing the ninth-floor landing before one new user boards. FIG. 6 is a flowchart explaining the method of acquiring the movement information of elevator users according to the embodiment of the present invention. FIG. 7 is a detailed control flow of control step S12 shown in FIG. 6. FIG. 8 is a detailed control flow of control step S13 shown in FIG. 6. FIG. 9 is a detailed control flow of control step S20 shown in FIG. 6. FIG. 10 shows specific data of the log output executed in control step S19 shown in FIG. 6. FIG. 11 shows other specific data of the log output executed in control step S19 shown in FIG. 6.
Next, embodiments of the present invention will be described in detail with reference to the drawings. The present invention is not limited to the following embodiments, and various modifications and applications within the technical concept of the present invention are included in its scope.
FIG. 1 is a block diagram showing an outline of a group-management-controlled elevator according to the first embodiment of the present invention.
The elevator operation management system 10, functioning as a group management control device, is connected to elevator unit control systems 11A to 11N that control the individual units of a plurality of elevators. The elevator unit control systems 11A to 11N receive control commands from the elevator operation management system 10 and actually operate the units; for example, they control the electric motor of each unit's hoisting machine, the brake mechanism of the hoisting machine, the door opening and closing motor, and the like.
The elevator operation management system 10 is also connected, via the communication network 12, to the hall elevator service request device 13, the building management system 14, the public institution management system 15, and the surveillance camera control system 16. The hall elevator service request device 13, the building management system 14, and the public institution management system 15 are not related to the present embodiment, so their detailed description is omitted. The system configuration shown in FIG. 1 is operated as a private system built for each building or for each group of buildings.
The elevator operation management system 10 according to the present embodiment comprises a learning unit 10A, a receiving unit 10B, a per-floor headcount evaluation unit 10C, an overall evaluation unit 10D, and an allocation command unit 10E, each of which can be realized as a control function of a computer.
The receiving unit 10B is connected to the communication network 12 and receives various related information from the surveillance camera control system 16. The received related information is sent to the learning unit 10A, where it is learned through rewriting processing and the like.
The received related information is also sent to the per-floor headcount evaluation unit 10C, where a predetermined evaluation calculation is executed. The evaluation result is sent to the overall evaluation unit 10D, where an overall evaluation calculation is executed together with other evaluation parameters. The unit allocation information calculated by the overall evaluation unit 10D is sent to the allocation command unit 10E, which transmits control commands to the corresponding elevator unit control systems 11A to 11N and has them execute the predetermined functions.
Meanwhile, the surveillance camera control system 16, which is the feature of the present embodiment, comprises an image input processing unit 16A for the surveillance cameras, a detection range setting processing unit 16B, a user detection processing unit 16C, an ID setting processing unit 16D, a per-floor user detection processing unit 16E, a movement information output processing unit 16F, and a log output processing unit 16G, all of which can likewise be realized as control functions of a computer.
The detection range setting processing unit 16B sets the detection range within which the user detection processing unit 16C described below detects users, and this range can be set arbitrarily; for example, a semicircular detection range of a predetermined radius, or a rectangular one, can be set in front of the landing door of each unit.
The user detection processing unit 16C obtains, from the image captured by the surveillance camera and input to the image input processing unit 16A, an image feature amount for detecting a person (a user), and identifies and extracts users from this feature amount. Alternatively, it identifies and extracts users by comparing the captured image with a model of a person's head or whole body.
In either case, the user detection processing unit 16C identifies and extracts users from the images captured by the surveillance cameras. It also stores discrimination information for each detected user, for example the user's image feature amount, which makes it possible to identify the same user after he or she has moved to another floor. In the following, the image feature amount used to identify a user is described as the discrimination information; however, any other information that identifies a user is likewise handled as discrimination information.
 The ID setting processing unit 16D sets and assigns ID information to every user detected by the user detection processing unit 16C. For example, if the discrimination information of a user's face or body is tied to ID information, the ID information can be traced through image analysis of that user. The discrimination information and ID information of individual users can also be linked by other methods. To ensure confidentiality, it is desirable to encrypt the discrimination information, ID information, and so on of individual users.
 Conditions for releasing the link between a user and an ID can also be set. The release condition may be a value fixed when the system is installed, or it may change in response to communication requests from various cooperating systems. Because linking a user to an ID requires storing the user's feature points in the computer, confidentiality must be considered all the more carefully in the scenes where the system is used.
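 A minimal sketch of one way to tie feature points to ID numbers and to release them later is shown below; matching by cosine similarity and the threshold value are assumptions, not specified here.

```python
# Illustrative ID registry: discrimination information (a feature vector)
# is tied to an ID number; release() discards the stored features so that
# no personal data remains. Cosine-similarity matching is an assumption.
import itertools
import numpy as np

class IdRegistry:
    def __init__(self, threshold: float = 0.9):
        self._counter = itertools.count(1)
        self._entries: dict = {}         # ID -> unit feature vector
        self._threshold = threshold

    def lookup_or_assign(self, feature: np.ndarray) -> int:
        feature = feature / np.linalg.norm(feature)
        for uid, ref in self._entries.items():
            if float(ref @ feature) >= self._threshold:
                return uid               # same user seen again
        uid = next(self._counter)
        self._entries[uid] = feature     # first detection of this user
        return uid

    def release(self, uid: int) -> None:
        # Called once the configured release condition is met.
        self._entries.pop(uid, None)
```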
 The floor-by-floor user detection processing unit 16E detects each user's movement information on each floor on the basis of the user's discrimination information and ID information. The movement information detected here includes, for example, the time and floor at which the ID information was generated, the car number, the user's boarding time and boarding floor, and the user's alighting time and alighting floor.
 Such movement information can be acquired because discrimination information and ID information are set and assigned through image analysis of users; users can then be traced by the surveillance cameras on each floor and inside the cars, which makes it possible to determine the movement information described above accurately.
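 A possible record layout for this per-user movement information is sketched below; the field names are illustrative, not part of the disclosure.

```python
# Illustrative record for one user's movement information; each field
# corresponds to an item named above (ID generation, boarding, alighting).
from dataclasses import dataclass
from typing import Optional

@dataclass
class MovementRecord:
    user_id: int                          # ID number tied to the discrimination info
    id_created_at: str                    # time the ID information was generated
    origin_floor: int                     # floor where the ID was generated
    car: Optional[str] = None             # boarding car, e.g. "4B"
    boarding_time: Optional[str] = None
    boarding_floor: Optional[int] = None
    alighting_time: Optional[str] = None
    alighting_floor: Optional[int] = None
```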
 The movement information output processing unit 16F stores the per-floor user movement information obtained by the floor-by-floor user detection processing unit 16E in a separately provided rewritable storage area so that it can be output as a log.
 The log output processing unit 16G outputs the user movement information stored by the movement information output processing unit 16F as a log. The logged movement information can therefore be used effectively as parameter data for prior simulations for group management control.
 Next, the details of the surveillance camera control system 16, which characterizes this embodiment, are described with reference to FIGS. 2 to 11. FIGS. 2 to 5 explain the underlying idea of the embodiment, and FIGS. 6 to 11 explain a concrete implementation.
 FIG. 2 shows the landing of a certain floor (for example, the fourth floor), with a landing surveillance camera 17 installed at an arbitrary position. The landing surveillance camera 17 is a wide-angle camera capable of capturing the entire landing. Four elevators are installed at this landing, operated as cars 4A to 4D.
 Needless to say, landing surveillance cameras 17 are also installed on the floors other than this one. The landing surveillance cameras 17 on the floors are configured as network cameras and are comprehensively managed and controlled by the surveillance camera control system 16, which makes it possible to trace users. At the landing, users are waiting in front of car 4B, which is operated downward, and car 4D, which is operated upward: four users Pdn heading downward wait at car 4B, and six users Pup heading upward wait at car 4D.
 FIG. 3 shows the user detection ranges within which image analysis is performed to detect the users for each car. In front of the landing doors 18A to 18D of cars 4A to 4D, user detection ranges 19A to 19D are set as semicircles of a predetermined radius centered near the middle of each landing door. A user detection range 19A to 19D is thus set for each of cars 4A to 4D.
 Since users are currently waiting at cars 4B and 4D, the users present in user detection ranges 19B and 19D can be detected by image analysis. The user detection ranges can be set arbitrarily through the image input processing unit 16A and the detection range setting processing unit 16B; besides a semicircle, a rectangle or any other shape may be used.
 For every user detected within a user detection range, image analysis extracts image feature points that serve as that user's discrimination information, and an ID number is set and assigned so that it is tied to the individual user's image feature points. Consequently, if a user photographed by another landing surveillance camera 17 has matching image feature points, the user's movement trajectory can be estimated from the ID number.
 For a user Pdn-1 located in an overlapping area where adjacent user detection ranges 19A to 19D overlap, it may be impossible to determine which user detection range the user belongs to. In this case, image analysis can extract feature points such as the user's face and shoulders, and the user detection range and the car can be determined from the posture and direction the user is facing. Alternatively, since the user's movement trajectory can be estimated from temporal changes in the image, the user detection range and the car can be determined by judging toward which car the user has moved.
 Even while a user is waiting for an elevator, the position coordinates of each elevator can be set in advance, or the elevator doors can be detected from images captured at the landing and their position coordinates set automatically; then, as described above, it can be judged from the detected posture and direction of the user toward which elevator position coordinates the user is facing, and hence at which car the user is waiting.
 In FIG. 3, therefore, the user Pdn-1 is detected in user detection range 19D as a user who will board car 4D. The user detection range can be determined by the same processing in the other overlapping areas as well. Naturally, users in overlapping areas are also set and assigned ID numbers tied to their image feature points, that is, to their discrimination information.
 In this state, movement information such as the image feature points serving as each user's discrimination information, the ID number, the time the ID number was assigned, the boarding floor, and the car can be acquired. Furthermore, when a user disappears from the user detection range, the boarding time can be acquired as the time the user boarded the corresponding car.
 FIG. 4 shows the movement of users who board car 4B shown in FIG. 3 and travel downward. Landing surveillance cameras 17-4 to 17-1 are installed on the floors, and the image information from each landing surveillance camera 17 is sent to the surveillance camera control system 16 for comprehensive management and control. The "n" in the camera reference numeral 17-n denotes the floor number. Similarly, an in-car surveillance camera 20 is installed in the car 21 of car 4B, and its image information is also sent to the surveillance camera control system 16 for comprehensive management and control.
 Next, the movement of users is described with reference to FIGS. 5A to 5E, which, in contrast to FIG. 4, show the case where the car 21 moves upward.
 First, as shown in FIG. 5A, users waiting on the fifth floor await the arrival of car 4B; here there are four users, Pa to Pd. By analyzing the image from landing surveillance camera 17-5 at this time, the discrimination information of each of the users Pa to Pd within user detection range 19B is extracted. ID numbers are then set and assigned so as to be tied to the discrimination information of users Pa to Pd, and at the same time the times at which the ID numbers were set are measured and stored. When car 4B arrives, users Pa to Pd board it as shown in FIG. 5B. When users Pa to Pd disappear from user detection range 19B, it is judged that they have boarded car 4B, and the boarding floor and boarding time are measured and stored.
 When users Pa to Pd board car 4B, the in-car surveillance camera 20 of its car sends their images to the surveillance camera control system 16; the discrimination information of users Pa to Pd is obtained from these images and compared with the discrimination information sent from landing surveillance camera 17-5. If the comparison matches, it is recognized that all of the waiting users Pa to Pd have boarded car 4B. If it does not match, it is recognized that some of the unmatched users moved by the stairs or other means.
 Next, when car 4B ascends and stops at the seventh floor, user Pd alights as shown in FIG. 5C. Landing surveillance camera 17-7 sends the image of user Pd to the surveillance camera control system 16; the discrimination information of user Pd is obtained from this image and compared with the discrimination information of user Pd in the image sent from landing surveillance camera 17-5. If they match, it is recognized that user Pd alighted at the seventh floor, and the alighting floor, alighting time, and so on are stored.
 Since user Pd also disappears from the image of the in-car surveillance camera 20 of car 4B, it can likewise be recognized that user Pd alighted at the seventh floor. Matching the discrimination information from landing surveillance camera 17-7 against the discrimination information from the in-car surveillance camera 20 therefore improves the accuracy of user recognition. Needless to say, performing the same processing on the following floors improves recognition accuracy in the same way.
 Next, when car 4B ascends and stops at the eighth floor, users Pa and Pb alight as shown in FIG. 5D. Landing surveillance camera 17-8 sends their images to the surveillance camera control system 16; their discrimination information is obtained from these images and compared with that of the images sent from landing surveillance camera 17-5. If they match, it is judged that users Pa and Pb alighted at the eighth floor, and the alighting floor, alighting time, and so on are stored. Since users Pa and Pb also disappear from the image of the in-car surveillance camera 20 of car 4B, it can be recognized that they alighted at the eighth floor.
 Next, when car 4B ascends and stops at the ninth floor, user Pc alights as shown in FIG. 5E. Landing surveillance camera 17-9 sends the image of user Pc to the surveillance camera control system 16; the discrimination information of user Pc is obtained from this image and compared with that of the image sent from landing surveillance camera 17-5. If they match, it is judged that user Pc alighted at the ninth floor, and the alighting floor, alighting time, and so on are stored. Since user Pc also disappears from the image of the in-car surveillance camera 20 of car 4B, it can be recognized that user Pc alighted at the ninth floor.
 Next, when a new user Pe heading downward appears at the ninth-floor landing, as shown in FIG. 5F, user Pe is within user detection range 19C; the discrimination information of user Pe is extracted, ID information is set and assigned, and at the same time the time at which the ID number was set is measured and stored. From then on, the movement of user Pe is traced and movement information is acquired.
 Based on this idea, a concrete embodiment is described next. FIG. 6 is a flowchart showing the computer control flow of the processing described above as executed by the surveillance camera control system 16. This control flow is started at start timings that recur at predetermined intervals.
 «Step S10»
 In step S10, the user detection range setting process is executed: user detection ranges 19A to 19D as shown in FIG. 3 are set on the basis of the input landing image. From the images within user detection ranges 19A to 19D, users are then extracted for each car in step S11 below. When the setting of the user detection ranges is complete, the flow proceeds to step S11.
 «Step S11»
 In step S11, the user detection process is executed: the images within the user detection ranges 19A to 19D set in step S10 are analyzed, and individual users are extracted for each car. For this extraction, image features for extracting users are computed from the image captured by the landing surveillance camera 17, and individual users are identified and extracted from these features. Since the features are used as the discrimination information of individual users, they are stored in a storage area (not shown).
 When the images within user detection ranges 19A to 19D are analyzed, a user located in an overlapping area where user detection ranges overlap may not be assignable to either detection range. Therefore, when the user detection process ends, the flow proceeds to step S12, which handles users present in overlapping areas.
 «Step S12»
 In step S12, the user detection range of a user present in an overlapping area is re-set; this processing is here called human vector detection. It determines which user detection range the user Pdn-1 present in the overlapping area (see FIG. 3) belongs to: image analysis extracts feature points such as the user's face and shoulders, and the user detection range and the car can be determined from the direction and posture the user is facing. In addition, since the movement trajectory of user Pdn-1 can be estimated from temporal changes in the image, the user detection range and the car can also be determined by judging toward which car the user has moved. In this way, the user detection range of a user present in an overlapping area can be set from at least one piece of information among the user's posture, direction, and movement trajectory. The human vector detection process is described in detail with reference to FIG. 7. When it ends, the flow proceeds to step S13.
 «Step S13»
 In step S13, ID numbers are set and assigned to the individual discrimination information of all users extracted in steps S11 and S12. If the discrimination information consisting of the features of a user's face and body is tied to ID information, the ID information of a moving user can be traced through image analysis. In this step, the ID number tied to each user's discrimination information is stored.
 For example, as shown in FIG. 4, the ID number "0001" can be set for user Pa, "0002" for user Pb, "0003" for user Pc, and "0004" for user Pd. If, for example, user Pa moves from a certain departure floor to a different arrival floor, the discrimination information of user Pa has already been acquired on the departure floor, so if image analysis on the arrival floor yields the same discrimination information, user Pa can be traced through ID number "0001". The ID assignment process is described in detail with reference to FIG. 8.
 The discrimination information and ID information of individual users can also be linked by other methods. To ensure confidentiality, it is desirable to encrypt the discrimination information, ID information, and so on of individual users. When the ID number assignment process is complete, the flow proceeds to step S14.
 «Step S14»
 In step S14, the time at which an individual user reached the landing's user detection range and was set and assigned an ID number (the generation time) is stored. Since the generation time differs for each ID number, it is possible, for example, to obtain the distribution of users' departure times for each floor.
 «Step S15»
 In step S15, it is detected whether a user has boarded the arriving car. This can be detected from the user's disappearance from the image: when a user disappears from the user detection range, it is judged that the user has boarded the car. In this case too, since the user's discrimination information and ID information have been acquired, the boarding information of individual users can be obtained.
 The user's boarding information can also be obtained from the image of the in-car surveillance camera 20 provided in the car 21, instead of judging from the image of the landing surveillance camera 17. As described above, the user's discrimination information and ID number have been acquired through the landing surveillance camera 17, so by comparing them with the user's discrimination information from the image of the in-car surveillance camera 20, it can be judged whether a user who was waiting at the landing has boarded. When the boarding-user detection process is complete, the flow proceeds to step S16.
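 For illustration, a minimal sketch of this disappearance-based boarding detection, comparing the IDs seen in a car's detection range on consecutive frames; the data shapes are assumptions.

```python
# Users present in a car's detection range on the previous frame but absent
# from the current one are taken to have boarded that car.
def detect_boardings(prev_ids: set, curr_ids: set,
                     car: str, timestamp: str) -> list:
    """Return (user_id, car, boarding_time) for each user who disappeared."""
    return [(uid, car, timestamp) for uid in sorted(prev_ids - curr_ids)]

# Example: users 1-4 wait at car 4B and all vanish when it departs.
print(detect_boardings({1, 2, 3, 4}, set(), "4B", "8:01:05"))
```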
 «Step S16»
 In step S16, the time at which the user boarded the car (the boarding time) is stored. Here too, since the user's discrimination information and ID number have been acquired, the boarding time of each user can be obtained from the boarding information detected for each user in step S15. When the boarding time has been acquired, the flow proceeds to step S17.
 «Step S17»
 In step S17, when the car the user boarded reaches the intended alighting floor, the user's alighting information is detected. A user's alighting can be detected by the landing surveillance camera 17 on the arrival floor: when the user alights at the arrival floor, the landing surveillance camera 17 there sends the user's image to the surveillance camera control system 16. The surveillance camera control system 16 obtains the discrimination information of the alighted user from the image and compares it with the discrimination information of the user in the image sent from the landing surveillance camera 17 on the departure floor. If they match, it is recognized that the user who boarded at the departure floor has alighted, and the ID number, alighting floor, and so on are stored. Since the user also disappears from the image of the in-car surveillance camera 20, it can likewise be estimated that the user has alighted. When the alighting-user detection process is complete, the flow proceeds to step S18.
 «Step S18»
 In step S18, the time at which the user alighted from the car (the alighting time) is stored. Here too, since the user's discrimination information and ID number have been acquired, the alighting time of each user can be obtained from the alighting information detected for each user in step S17. When the alighting time has been acquired, the flow proceeds to step S19.
 «Step S19»
 In step S19, the movement information (movement history) of each user, acquired by executing the control steps described above, is stored in a rewritable log storage area so that it can be output as a log. In this case, the "user discrimination information", the "ID number" tied to it, the "generation time (setting time)" of the ID number, the "generation floor number" where the ID number was generated, the "boarding time" and "boarding floor number" of the boarded car, and the "alighting time" and "alighting floor number" of the car the user alighted from are stored. Since information on the user detection range and the car is also retained, it can be placed in the log storage area as needed. When the user movement information has been stored in the rewritable log storage area, the flow proceeds to step S20.
 «Step S20»
 In step S20, each user's ID number and the discrimination information tied to it are released (erased), so that only the user's substantive movement information remains as a history. When step S20 ends, the control flow of FIG. 6 ends and the system waits for the next start timing. Because the user's movement information can be grasped accurately in this way, the accuracy of prior simulations for group management control can be improved. The ID release process is described in detail with reference to FIG. 9.
 Next, a concrete example of step S12 in FIG. 6 is described with reference to FIG. 7. When step S11 in FIG. 6 is complete, steps S30 to S34 in FIG. 7 are executed, after which the flow proceeds to step S13.
 «Step S30»
 In step S30, it is judged from the camera image whether a new user has been detected at the landing. Since image analysis makes it possible to identify individual users, a user who newly appears at the landing relative to the users present so far can be identified by extraction. If no new user is extracted, the flow exits to the end; if a new user is extracted, it proceeds to step S31.
 «Step S31»
 In step S31, as shown in FIG. 3, it is judged whether the new user Pdn-1 is in the overlapping area of user detection ranges 19D and 19C. If the new user Pdn-1 is judged to be in the overlapping area, the flow proceeds to step S32; if not, it proceeds to step S33.
 «Step S32»
 In step S32, the human vector processing (step S12) shown in FIG. 6 determines the user detection range to which the new user belongs, and thereby also the car. This is quantified as the detected user's position coordinates and the user's facing direction, expressed as an angle relative to an arbitrary reference direction. Taking user Pdn-1 in FIG. 3 as an example, in a plan view of the landing seen from above, the direction pointing straight up in the figure is taken as the 0° reference. From the elevators' position coordinates and the user's position coordinates, it is then detected which elevator the user is facing.
 For example, the mapping may simply be: 1° to 90°, waiting for car 1; 91° to 180°, waiting for car 2; 181° to 270°, waiting for car 3; and 271° to 0°, waiting for car 4. Alternatively, elevator position areas may be set, and the angles used to detect waiting for each car may vary with the user's position coordinates. In the example of FIG. 3, the user's orientation is detected, and if user Pdn-1 faces 120°, the user is judged to be waiting for car 2.
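 The simple fixed-threshold variant can be sketched as follows; the thresholds and the 120° example follow the text, while the function name is illustrative.

```python
# Map the user's facing angle (degrees; 0 deg points "up" in the plan view)
# to the car the user is presumed to be waiting for, using the fixed
# quadrant thresholds described above.
def waiting_car(angle_deg: float) -> int:
    a = angle_deg % 360.0
    if 1.0 <= a <= 90.0:
        return 1
    if 91.0 <= a <= 180.0:
        return 2
    if 181.0 <= a <= 270.0:
        return 3
    return 4                     # 271 deg through 0 deg

print(waiting_car(120.0))        # 2: user Pdn-1 is judged to wait for car 2
```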
 The user detection range is thus re-set by setting the waiting direction of each car, detecting the waiting direction of a user in the overlapping area from the surveillance camera image, and comparing it with the cars' waiting directions, whereby the user detection range of the user present in the overlapping area can be set.
 «Step S33»
 Since it was judged in step S31 that the new user clearly belongs to a user detection range, in step S33 the car is determined on the basis of that user detection range.
 «Step S34»
 In step S34, a car number is assigned both to users in overlapping areas and to users extracted within a user detection range. Likewise, when a new user appears within a user detection range rather than an overlapping area, a car number is assigned to the user extracted within that range. When this process is complete, the flow proceeds to step S13 in FIG. 6 and the control flow shown there continues.
 Next, a concrete example of step S13 in FIG. 6 is described with reference to FIG. 8. When step S12 in FIG. 6 is complete, steps S40 and S41 in FIG. 8 are executed, after which the flow proceeds to step S14.
 «Step S40»
 In step S40, it is judged whether the discrimination information of a user within the user detection range has been detected for the first time. The discrimination information of users present in the user detection ranges up to this point is already stored in the storage area of the surveillance camera control system 16, so by comparing this stored discrimination information with the newly detected discrimination information, it can be judged whether this is a first-time detection.
 If the discrimination information is not a first-time detection, the flow exits to the end; if it is, the flow proceeds to step S41.
 «Step S41»
 In step S41, a new ID number is assigned to the discrimination information of a user detected for the first time. Since the user's discrimination information and ID number are stored tied together, the movement of the user who has been given the ID number can be traced. When this process is complete, the flow proceeds to step S14 in FIG. 6 and the control flow shown there continues.
 Next, a concrete example of step S20 in FIG. 6 is described with reference to FIG. 9. When step S19 in FIG. 6 is complete, steps S50 to S52 in FIG. 9 are executed and the flow moves to the end.
 «Step S50»
 In step S50, it is judged whether the ID number release process is required. In this embodiment, there is a setting for tracing a user's movement information (movement history) over a predetermined period and a setting for not doing so; as an example, the period is set in one-day units. If the release condition is set to "one-day units", the flow proceeds to step S51; if not, it proceeds to step S52.
 «Step S51»
 In step S51, the discrimination information (features) and the ID number are not released until the time of day reaches 0:00:00; until then they are held tied together. This makes it possible to trace the user's movement over the course of a day.
 «Step S52»
 In step S52, if a predetermined release condition is satisfied within a period shorter than the one-day unit, the ID number assigned to an individual user and the discrimination information tied to it are released (erased). As noted earlier, what is released is only the ID number assigned to the individual user and the discrimination information tied to it; the other items, such as the "generation time (setting time)" of the ID number, the "generation floor number" where the ID number was generated, the "boarding time" and "boarding floor number", and the "alighting time" and "alighting floor number", are retained as simulation parameter data.
 Possible release conditions set in step S52 include: (1) the user's alighting from the car; (2) the expiry of a time such as the round-trip time used in the traffic calculations for elevator installation planning; and (3), when transfers between a first elevator group and a second elevator group are assumed, the user's final alighting after transferring from the first group to the second, or from the second group to the first (for example, when there is a transfer from a low-rise to a high-rise bank, the release condition is the alighting from the high-rise elevator). An appropriate release condition may be set accordingly.
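 A minimal sketch of evaluating such release conditions follows; the record fields, the policy names, and the round-trip time value are assumptions.

```python
# Decide whether an ID and its discrimination information may be released,
# for the three example conditions above; field names are illustrative.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class IdEntry:
    created: datetime
    alighting_floor: Optional[int] = None   # set once the user alights
    round_trip_s: float = 120.0             # assumed one-round time

def should_release(entry: IdEntry, now: datetime, policy: str) -> bool:
    if policy == "on_alight":        # (1) user alighted from the car
        return entry.alighting_floor is not None
    if policy == "round_time":       # (2) traffic-calculation round time expired
        return (now - entry.created).total_seconds() > entry.round_trip_s
    if policy == "daily":            # one-day unit: a new day has begun
        return now.date() > entry.created.date()
    return False
```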
 FIG. 10 shows a concrete example of the log tabulation process executed in step S19; it is the log output when alighting from the car is used as the ID release condition.
 The log output stores the "user discrimination information" representing the user, the "ID number" tied to it, the "generation time" of the ID number, the "generation floor number" where the ID number was generated, the "boarding time" and "boarding floor number" of the boarded car, and the "alighting time" and "alighting floor number" of the car the user alighted from.
 For example, for user Pa the ID number is set to "0001"; this ID number was generated on the "5th floor" at "8:00:01"; user Pa boarded a car at the "5th floor" landing at "8:01:05"; and user Pa alighted from the car at the "8th floor" landing at "8:01:20". The same applies to the other users, and the movement information of these users can improve the accuracy of prior simulations for group management control.
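 Rendered as a plain record, the FIG. 10 row for user Pa would look as follows; the key names are illustrative.

```python
# The FIG. 10 example for user Pa as a plain log record.
pa_log = {
    "id": "0001",
    "id_created_at": "8:00:01", "origin_floor": 5,
    "boarding_time": "8:01:05", "boarding_floor": 5,
    "alighting_time": "8:01:20", "alighting_floor": 8,
}
print(pa_log)
```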
 When the ID release process is executed, the user's discrimination information and the "ID number" tied to it are erased and replaced by generic user labels (for example, A, B, C, ...). Since the simulation does not need users' personal information, there is no problem in deleting personal information such as the discrimination information, and doing so is also desirable from the standpoint of protecting personal information.
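 A sketch of this anonymisation step, assuming the records are plain dictionaries; the generic labels A, B, C follow the example above.

```python
# Drop discrimination information and ID numbers from finished records and
# substitute generic user labels; illustration only (26 labels suffice here).
import string

def anonymise(records: list) -> list:
    labels = iter(string.ascii_uppercase)
    out = []
    for rec in records:
        rec = dict(rec)                 # leave the input untouched
        rec.pop("features", None)       # personal data removed
        rec.pop("id", None)
        rec["user"] = next(labels)      # "A", "B", "C", ...
        out.append(rec)
    return out
```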
 FIG. 11 shows a concrete example of the log tabulation process executed in step S19; it is the log output when "one-day units" is used as the ID release condition. The log output is basically the same as in FIG. 10, but because the ID release condition is "one-day units", movement information is acquired on a per-day basis, as shown for user Pc.
 For example, in the first detection the ID number "0003" is set for user Pc; this ID number was generated on the "5th floor" at "8:00:04"; user Pc boarded a car at the "5th floor" landing at "8:01:06"; and user Pc alighted from the car at the "9th floor" landing at "8:01:55" and headed to an office.
 Even after user Pc has worked in the ninth-floor office and time has passed, user Pc's ID number is still held as "0003"; this ID number is re-detected on the "9th floor" at "10:10:25"; user Pc boards a car at the "9th floor" landing at "10:10:36"; and user Pc alights from the car at the "1st floor" landing at "10:11:03". The same applies to the other users, and the movement information of these users can improve the accuracy of prior simulations for group management control. Furthermore, by setting ID release conditions, it is possible to respect privacy while creating data that preserves simulation accuracy in any usage scene.
 In this embodiment, the "user discrimination information" representing the user described above, the "ID number" tied to it, the "generation time" of the ID number, the "generation floor number" where the ID number was generated, the "boarding time" and "boarding floor number", and the "alighting time" and "alighting floor number" are stored, but other items can be stored as needed.
 The embodiment described above is an example in which the surveillance camera control system 16 comprehensively manages and controls the images of the landing surveillance cameras 17 provided at the landing of each floor and of the in-car surveillance camera 20 provided in the car; however, users' movement information may instead be acquired using only the landing surveillance cameras 17 at the landings, or using only the in-car surveillance camera 20 in the car.
 When users' movement information is acquired using only the landing surveillance cameras 17 provided at the landings, this can basically be realized with the same control as the control flow shown in FIG. 6. In this case, the boarding-user detection of step S15 and the alighting-user detection of step S17 can be implemented by detecting the user's behavior from the image of the landing surveillance camera 17 on the boarding side and the image of the landing surveillance camera 17 on the alighting side: boarding can be detected by the user's disappearance from the user detection range, and alighting by the user's appearance in the user detection range.
 When users' movement information is acquired using only the in-car surveillance camera 20 provided in the car, the discrimination information of each user detected by the in-car surveillance camera 20 is tied to an ID number, and the appearance time at which the user appeared in the car and the disappearance time at which the user vanished from the car are stored for each ID number. The departure floor and arrival floor can then be tied to each ID number from the car information and the time information, and the log output can be obtained.
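 A minimal sketch of this in-car-only variant, pairing each ID's appearance and disappearance inside the car with the car's floor at those times; the event and lookup shapes are assumptions.

```python
# Recover origin and destination floors per ID from in-car appearance and
# disappearance events, using the car's floor at each event time.
def od_from_car_log(events: list, floor_at: dict) -> dict:
    """events: (kind, user_id, time) with kind in {"appear", "vanish"};
    floor_at: time -> floor. Returns user_id -> (origin, destination)."""
    origin = {}
    trips = {}
    for kind, uid, t in events:
        if kind == "appear":
            origin[uid] = floor_at[t]
        elif kind == "vanish" and uid in origin:
            trips[uid] = (origin.pop(uid), floor_at[t])
    return trips

# Example: an ID appears at floor 5 (8:01:06) and vanishes at floor 9 (8:01:55).
print(od_from_car_log([("appear", 3, "8:01:06"), ("vanish", 3, "8:01:55")],
                      {"8:01:06": 5, "8:01:55": 9}))  # {3: (5, 9)}
```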
 As described above, the present invention sets and assigns, from the images of landing cameras that capture the elevator landings or of in-car cameras that capture the inside of the cars, discrimination information that distinguishes each user, together with ID information; based on this discrimination information and ID information, each user's movement is traced, and at least the user's boarding floor and alighting floor are tied together and stored, so that the user's movement information is grasped.
 Since users' movement information can thereby be grasped accurately, the accurate parameter data needed, for example, for prior simulations for group management control can be obtained, and simulation accuracy can be improved.
 The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments have been described in detail in order to explain the invention clearly, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to that of one embodiment. For part of the configuration of each embodiment, other configurations can be added, deleted, or substituted.
 10: elevator operation management system; 10A: learning unit; 10B: receiving unit; 10C: floor-by-floor passenger-count evaluation unit; 10D: overall evaluation unit; 10E: allocation command unit; 11A to 11N: car elevator control systems; 12: communication network; 16: surveillance camera control system; 16A: image input processing unit; 16B: detection range setting processing unit; 16C: user detection processing unit; 16D: ID setting processing unit; 16E: floor-by-floor user detection processing unit; 16F: movement information output processing unit; 16G: log output processing unit; 17: landing surveillance camera; 18A to 18D: landing doors; 19A to 19D: user detection ranges; 20: in-car surveillance camera.

Claims (14)

  1.  An elevator usage log output system that analyzes an image from a landing camera capturing a landing served by a plurality of elevator cars to grasp movement information of users at the landing, the elevator usage log output system comprising:
     user detection means for detecting, from the image of the landing camera, discrimination information that distinguishes individual users;
     ID number setting means for setting and assigning an individual ID number corresponding to the discrimination information;
     movement information detection means for detecting movement information of an individual user from the discrimination information and the ID number; and
     storage means for storing the movement information detected by the movement information detection means in correspondence with the ID number.
  2.  The elevator usage log output system according to claim 1, wherein
     the movement information detection means comprises user recognition means for comparing the discrimination information of an individual user boarding a car from the landing with the discrimination information of an individual user alighting from the car at another landing and identifying an individual user whose discrimination information at boarding matches the discrimination information at alighting, and
     detects the movement information of the individual user recognized by the user recognition means.
  3.  The elevator usage log output system according to claim 2, further comprising
     user detection range setting means for setting, from the image of the landing camera, a user detection range from which users are extracted for each car at the landing,
     wherein the user detection means extracts the discrimination information of individual users from the user detection range set for each car by the user detection range setting means.
  4.  The elevator usage log output system according to claim 3, further comprising
     user detection range resetting means for, when a user is present in an overlapping area of adjacent user detection ranges set by the user detection range setting means, analyzing the image of the user in the overlapping area and setting the user detection range of the user present in the overlapping area.
  5.  The elevator usage log output system according to claim 4, wherein
     the user detection range resetting means sets the waiting direction of each car, detects the waiting direction of the user in the overlapping area from the image of the landing camera, compares it with the waiting directions of the cars, and sets the user detection range of the user present in the overlapping area.
  6.  The elevator usage log output system according to claim 2, further comprising
     ID number release means for releasing, when the movement information is stored in the storage means, the ID number of an individual user and the discrimination information tied to it in accordance with a predetermined release condition.
  7.  An elevator usage log output method for analyzing an image from a landing camera capturing a landing served by a plurality of elevator cars to grasp movement information of users at the landing, the elevator usage log output method comprising:
     detecting, from the image of the landing camera, discrimination information that distinguishes individual users;
     setting an individual ID number corresponding to the discrimination information;
     detecting movement information of an individual user from the discrimination information and the ID number; and
     storing the detected movement information in correspondence with the ID number.
  8.  The elevator usage log output method according to claim 7, wherein
     in detecting the movement information, the discrimination information of an individual user boarding a car from the landing is compared with the discrimination information of an individual user alighting from the car at another landing, an individual user whose discrimination information at boarding matches the discrimination information at alighting is identified, and the movement information is detected.
9.  The elevator usage log output method according to claim 8, wherein the method further comprises:
     setting, from the image of the hall camera, a user detection range for extracting users for each boarding car at the hall; and
     extracting the discrimination information of an individual user from the user detection range set for each boarding car.
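For claim 9, the hall-camera frame is partitioned into one user detection range per boarding car, and discrimination information is extracted only within each range. A sketch that models ranges as axis-aligned rectangles, an assumed geometry the claim itself does not fix; users falling inside more than one rectangle go to an overlap bucket for the resolution step of claims 10 and 11, which follow.

    def users_by_car(detections, ranges):
        """Group detected users by the car-specific detection range they occupy.

        detections -- list of (features, (x, y)) in image coordinates
        ranges     -- dict car_id -> (x_min, y_min, x_max, y_max)
        Users inside more than one rectangle land in the "overlap" bucket,
        to be resolved by the waiting-direction comparison sketched earlier.
        """
        grouped = {car: [] for car in ranges}
        grouped["overlap"] = []
        for features, (x, y) in detections:
            hits = [car for car, (x0, y0, x1, y1) in ranges.items()
                    if x0 <= x <= x1 and y0 <= y <= y1]
            if len(hits) == 1:
                grouped[hits[0]].append(features)
            elif len(hits) > 1:
                grouped["overlap"].append(features)
        return grouped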
10.  The elevator usage log output method according to claim 9, wherein,
      when a user is present in an overlapping area of adjacent user detection ranges, the image of the user in the overlapping area is analyzed and the user detection range of the user present in the overlapping area is reset.
11.  The elevator usage log output method according to claim 10, wherein
      a waiting direction is set for each of the boarding cars and the waiting direction of the user in the overlapping area is detected from the image of the hall camera, and
      the user detection range of the user present in the overlapping area is set by comparison with the waiting directions of the boarding cars.
12.  The elevator usage log output method according to claim 8, wherein,
      once the movement information has been stored, the ID number of an individual user and the discrimination information linked to it are released in accordance with a predetermined release condition.
13.  An elevator usage log output system that analyzes images from an in-car camera capturing the interiors of the cars of a plurality of elevator boarding cars and grasps movement information of users inside the cars, the system comprising:
      user detection means for detecting, from the image of the in-car camera, discrimination information for discriminating an individual user;
      ID number setting means for setting and assigning an individual ID number corresponding to the discrimination information;
      movement information detection means for detecting movement information of the individual user from the discrimination information and the ID number; and
      storage means for storing the movement information detected by the movement information detection means in association with the ID number.
14.  An elevator usage log output method that analyzes images from an in-car camera capturing the interiors of the cars of a plurality of elevator boarding cars and grasps movement information of users inside the cars, the method comprising:
      detecting, from the image of the in-car camera, discrimination information for discriminating an individual user;
      setting an individual ID number corresponding to the discrimination information;
      detecting movement information of the individual user from the discrimination information and the ID number; and
      storing the detected movement information in association with the ID number.
PCT/JP2017/039131 2017-10-30 2017-10-30 Elevator usage log output system, and elevator usage log output method WO2019087251A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2017/039131 WO2019087251A1 (en) 2017-10-30 2017-10-30 Elevator usage log output system, and elevator usage log output method
CN201780095835.1A CN111212802B (en) 2017-10-30 2017-10-30 Elevator use log output system and elevator use log output method
JP2019550004A JP7005648B2 (en) 2017-10-30 2017-10-30 Elevator usage log output system and elevator usage log output method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/039131 WO2019087251A1 (en) 2017-10-30 2017-10-30 Elevator usage log output system, and elevator usage log output method

Publications (1)

Publication Number Publication Date
WO2019087251A1

Family

ID=66331617

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/039131 WO2019087251A1 (en) 2017-10-30 2017-10-30 Elevator usage log output system, and elevator usage log output method

Country Status (3)

Country Link
JP (1) JP7005648B2 (en)
CN (1) CN111212802B (en)
WO (1) WO2019087251A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103221984B * 2010-11-19 2016-10-05 Nikon Corporation Guidance device, detection device, and posture state determination device
CN203794388U * 2013-12-02 2014-08-27 Shandong RFID Application Engineering Technology Research Center Co., Ltd. Elevator operation monitoring and pre-warning system
EP3227828B1 * 2014-12-03 2023-10-04 Inventio AG System and method for alternatively interacting with elevators
CN104787635B * 2015-04-24 2017-02-22 Ningxia Diantong Internet of Things Technology Co., Ltd. Elevator floor data collecting device and elevator floor operation monitoring and controlling system and method
CN105173930B * 2015-06-17 2017-04-12 Xiamen Naier Electronics Co., Ltd. Intelligent elevator system based on mobile terminal and intelligent elevator riding method
CN106553941B * 2015-09-30 2019-07-05 Tencent Technology (Shenzhen) Co., Ltd. Intelligent elevator group control method and system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09278301A * 1996-04-11 1997-10-28 Mitsubishi Electric Corp Operation control device for elevator
JP2004048519A * 2002-07-15 2004-02-12 Hitachi Ltd Burglar alarm device
JP2003276963A * 2003-02-03 2003-10-02 Toshiba Corp Elevator controller using an image monitoring device
JP2007186300A * 2006-01-13 2007-07-26 Toshiba Elevator Co Ltd Entrance-lock-interlocked elevator control system
JP2013500221A * 2009-07-28 2013-01-07 Marimils Oy A system for controlling an elevator in an elevator system
JP2012113367A * 2010-11-19 2012-06-14 Nikon Corp Guidance system
JP2013175049A * 2012-02-24 2013-09-05 Toshiba Elevator Co Ltd Passenger counting device for an elevator, elevator equipped with a passenger counting device, and elevator system in which each of two or more elevators has a passenger counting device
JP2014219913A * 2013-05-10 2014-11-20 Giken Trastem Co., Ltd. Apparatus for counting boarding and alighting passengers

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020261560A1 * 2019-06-28 2020-12-30 Mitsubishi Electric Corporation Building management system
JPWO2020261560A1 * 2019-06-28 2021-11-25 Mitsubishi Electric Corporation Building management system
CN113993804A * 2019-06-28 2022-01-28 Mitsubishi Electric Corporation Building management system
JP7398913B2 2019-09-24 2023-12-15 Information Services International-Dentsu, Ltd. Mobile body movement management system
JP2021066575A * 2019-10-25 2021-04-30 Hitachi, Ltd. Elevator system
EP4082955A4 * 2019-12-26 2024-01-24 Hitachi Ltd Architectural model data assistance system and architectural model data assistance method
CN111612814A * 2020-02-04 2020-09-01 Beijing Megvii Technology Co., Ltd. Method, device and electronic system for identifying and tracking persons with fever

Also Published As

Publication number Publication date
CN111212802B (en) 2021-06-29
CN111212802A (en) 2020-05-29
JP7005648B2 (en) 2022-01-21
JPWO2019087251A1 (en) 2020-10-22

Similar Documents

Publication Publication Date Title
WO2019087251A1 (en) Elevator usage log output system, and elevator usage log output method
CN109292579B (en) Elevator system, image recognition method and operation control method
EP3041775B1 (en) Elevator dispatch using facial recognition
JP5988472B2 (en) Monitoring system and congestion rate calculation method
CN110861983B (en) Elevator operation control method and device
CN106915672B (en) Elevator group management control device, group management system, and elevator system
CN101506077B (en) Anonymous passenger indexing system for security tracking in destination entry dispatching operations
US10259681B2 (en) Elevator dispatch using fingerprint recognition
CN109311622B (en) Elevator system and car call estimation method
JP2009523678A (en) Video assisted system for elevator control
JP6970206B2 (en) Elevator operation management system and operation management method
CN111225866B (en) Automatic call registration system and automatic call registration method
KR20200075378A (en) Monitoring system for building occupant density using cctv, and method for the same
CN107000960B (en) Evacuation controller
JP2017052578A Method and device for predicting and presenting boarding and alighting conditions at elevator car arrival
JP5596423B2 (en) Elevator control system
JP2012246115A Device and program for visualizing entry/exit history information
JP7333773B2 (en) Elevator system and operation control method for elevator device
CN104671020A (en) Elevator system
CN109626154A Elevator control method and system for assisting public order maintenance, and elevator
WO2019087710A1 (en) System for detecting number of persons bound upward/downward in elevator, and method for detecting number of persons bound upward/downward
CN112699843A (en) Identity recognition method and system
KR102580367B1 (en) Control method for the access control system for visitors to apartment house
JP7281550B2 (en) Train monitoring system and train monitoring method
WO2022064567A1 (en) Vertical-lift mechanism monitoring system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 17930338
    Country of ref document: EP
    Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2019550004
    Country of ref document: JP
    Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 17930338
    Country of ref document: EP
    Kind code of ref document: A1