CN111212802B - Elevator use log output system and elevator use log output method - Google Patents


Info

Publication number
CN111212802B
CN201780095835.1A (application) · CN111212802B (publication)
Authority
CN
China
Prior art keywords
user
elevator
information
discrimination information
detection range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780095835.1A
Other languages
Chinese (zh)
Other versions
CN111212802A
Inventor
羽鸟贵大
藤原正康
小町章
星野孝道
鸟谷部训
加藤学
藤野笃哉
鸟海涉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Publication of CN111212802A publication Critical patent/CN111212802A/en
Application granted granted Critical
Publication of CN111212802B publication Critical patent/CN111212802B/en

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B66 — HOISTING; LIFTING; HAULING
    • B66B — ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B11/00 — Main component parts of lifts in, or associated with, buildings or other structures
    • B66B11/02 — Cages, i.e. cars
    • B66B3/00 — Applications of devices for indicating or signalling operating conditions of elevators
    • B66B5/00 — Applications of checking, fault-correcting, or safety devices in elevators

Abstract

Provided are an elevator use log output system and an elevator use log output method capable of accurately capturing a user's movement information. Discrimination information and ID information for distinguishing each user (Pa to Pd) are set based on the image of a camera (17) that photographs the elevator hall or a camera (20) that photographs the inside of the car; the movement of each user (Pa to Pd) is tracked using this discrimination information and ID information, and at least the user's boarding floor and alighting floor are stored in association with each other. Because the user's movement information is captured accurately, the accuracy of, for example, a preliminary simulation used for group management control can be improved.

Description

Elevator use log output system and elevator use log output method
Technical Field
The present invention relates to an elevator use log output system and an elevator use log output method for accurately capturing the movement information of elevator users.
Background
In relatively large buildings, multiple elevators are installed side by side to increase passenger-carrying capacity, and systems that select the most suitable car to serve each elevator hall call have been introduced. As buildings grow larger, the number of elevators installed together also grows, and a group management control device coordinates these elevators appropriately to improve service, for example by reducing users' waiting times.
In elevators that use such a group management control device, the behavior of users is measured so that each elevator can be controlled appropriately. For example, the following systems have been proposed in Japanese Patent Laid-Open No. 2010-254391 (Patent Document 1), Japanese Patent Laid-Open No. 2003-221174 (Patent Document 2), and Japanese Patent Laid-Open No. 2006-21852 (Patent Document 3).
The elevator system disclosed in Patent Document 1 provides a detector around the elevator hall door that detects pull-in events, and a history recording unit that records the events detected by this detector. A detection-count unit counts the events recorded by the history unit, and the opening and closing of the hall door is controlled based on the number of events detected in a predetermined period. This prevents users from being caught by the hall door or pulled into the door pocket.
In the elevator system disclosed in Patent Document 2, a monitoring camera is installed in the car, and camera images are transmitted to the elevator maintenance company at predetermined times so that users can be confirmed. This makes it easy to confirm remotely, for example, the last person leaving the building or a person entering it at night, improving the building's security functions.
Further, in the elevator system disclosed in Patent Document 3, video of the car captured by a monitoring camera is recorded by a video recording means, and when a signal indicating an event with a high likelihood of crime is received, the car video for a predetermined period is stored together with the reception time and identification data for the event. This saves in-car video recording capacity and allows the relevant video to be retrieved quickly when an incident occurs.
Prior art documents
Patent document
Patent Document 1: Japanese Patent Laid-Open No. 2010-254391
Patent Document 2: Japanese Patent Laid-Open No. 2003-221174
Patent Document 3: Japanese Patent Laid-Open No. 2006-21852
Disclosure of Invention
Problems to be solved by the invention
However, in an elevator equipped with such a group management control device, operating the elevators effectively often requires running a simulation in advance. To do so, the users' movement information (movement up or down through the building, number of users, car boarding, waiting time, departure floor, arrival floor, and so on) must be captured accurately. If the users' movement information can be captured accurately, accurate parameter data for the preliminary simulation used in group management control can be obtained, and the accuracy of the simulation improves. There is therefore strong demand for a group management control device that accurately captures users' movement information.
An object of the present invention is to provide an elevator use log output system and an elevator use log output method that can accurately capture the movement information of users.
Means for solving the problems
The present invention is characterized in that discrimination information and ID information for distinguishing each user are set and assigned based on the image of an elevator hall camera that photographs the elevator hall or of an in-car camera that photographs the inside of the car; the user's movement is tracked based on this discrimination information and ID information; and at least the user's boarding floor and alighting floor are stored in association with each other, thereby capturing the user's movement information.
Effects of the invention
According to the present invention, the movement information of users can be captured accurately, so accurate parameter data for, for example, a preliminary simulation used in group management control can be obtained, improving the accuracy of the simulation.
Drawings
Fig. 1 is a schematic configuration diagram showing an elevator operation management system and a monitoring camera control system according to a first embodiment of the present invention.
Fig. 2 is an external view of an elevator hall as viewed from obliquely above.
Fig. 3 is an explanatory diagram showing a user detection range set for each riding car.
Fig. 4 is an explanatory diagram illustrating the movement of users riding the car from the 4th floor to the 1st floor.
Fig. 5A illustrates the movement of users, showing the 5th-floor elevator hall before the users board the car.
Fig. 5B illustrates the movement of users, showing the 5th-floor elevator hall after the users have boarded the car.
Fig. 5C illustrates the movement of users, showing the 7th-floor elevator hall after one user has alighted.
Fig. 5D illustrates the movement of users, showing the 8th-floor elevator hall after two users have alighted.
Fig. 5E illustrates the movement of users, showing the 9th-floor elevator hall after one user has alighted.
Fig. 5F is an explanatory view showing the 9th-floor elevator hall before a new user boards.
Fig. 6 is a flowchart illustrating a method of acquiring movement information of a user of an elevator according to an embodiment of the present invention.
Fig. 7 is a detailed control flow of the control step S12 shown in fig. 6.
Fig. 8 is a detailed control flow of the control step S13 shown in fig. 6.
Fig. 9 is a detailed control flow of the control step S20 shown in fig. 6.
Fig. 10 is a diagram showing specific data of the log output executed in control step S19 shown in fig. 6.
Fig. 11 is a diagram showing other specific data of the log output executed in control step S19 shown in fig. 6.
Detailed Description
Next, embodiments of the present invention will be described in detail with reference to the drawings, but the present invention is not limited to the following embodiments, and various modifications and application examples are included in the technical concept of the present invention.
Fig. 1 is a schematic configuration diagram showing an elevator that performs group management control according to a first embodiment of the present invention.
An elevator operation management system 10, functioning as a group management control device, is connected to riding car elevator control systems 11A to 11N that control the riding cars of the plurality of elevators. The riding car elevator control systems 11A to 11N receive control commands from the elevator operation management system 10 and actually operate the cars, controlling, for example, the hoisting machine motor of each car, the hoisting machine brake mechanism, and the door opening/closing motor.
The elevator operation management system 10 is connected to an elevator hall elevator service request device 13, a building management system 14, a public institution management system 15, and a monitoring camera control system 16 via a communication network 12. The elevator hall elevator service request device 13, the building management system 14, and the public institution management system 15 are not related to the present embodiment, and therefore, detailed description thereof is omitted. The system configuration shown in fig. 1 is applied as a dedicated system constructed for each building or each of a plurality of buildings.
The elevator operation management system 10 according to the present embodiment is composed of a learning unit 10A, a receiving unit 10B, a floor-based number-of-persons evaluation unit 10C, a comprehensive evaluation unit 10D, and an assignment instruction unit 10E, and each of these units can be realized as a control function of a computer.
The receiving unit 10B is connected to the communication network 12, and receives various kinds of related information from the monitoring camera control system 16. The received various pieces of related information are transmitted to the learning unit 10A, and the various pieces of related information are learned by executing rewriting processing and the like.
The received related information is also sent to the floor-based number-of-persons evaluation unit 10C, where a predetermined evaluation calculation is executed. The result is sent to the comprehensive evaluation unit 10D, which performs a comprehensive evaluation calculation together with other evaluation parameters. The car operation assignment information calculated by the comprehensive evaluation unit 10D is sent to the assignment command unit 10E, which transmits control commands to the corresponding riding car elevator control systems 11A to 11N, causing them to execute the predetermined functions.
On the other hand, the monitoring camera control system 16, which is a feature of the present embodiment, is composed of a monitoring camera image input processing unit 16A, a detection range setting processing unit 16B, a user detection processing unit 16C, an ID setting processing unit 16D, a floor-by-floor user detection processing unit 16E, a movement information output processing unit 16F, and a log output processing unit 16G; these can also be realized as control functions of a computer.
The detection range setting processing unit 16B sets the user detection range used by the user detection processing unit 16C described below, and the range can be set arbitrarily. For example, a semicircular detection range of a predetermined radius, or a rectangular detection range, can be set in front of the elevator hall door of each car in the elevator hall.
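As an illustration only (the patent gives no code), the semicircular and rectangular ranges described above amount to a point-in-region test. The following Python sketch assumes a simple floor-plane coordinate system in which the hall extends in the +y direction from the door centre; the function names and coordinate convention are assumptions, not the patent's implementation.

```python
import math

def in_semicircle(x, y, door_x, door_y, radius):
    """True if point (x, y) lies in a semicircular detection range of the
    given radius, centred on the hall door and extending into the hall
    (here assumed to be the +y half-plane in front of the door)."""
    dx, dy = x - door_x, y - door_y
    return dy >= 0 and math.hypot(dx, dy) <= radius

def in_rectangle(x, y, door_x, door_y, width, depth):
    """True if (x, y) lies in a rectangular range of the given width,
    extending `depth` into the hall from the door centre."""
    return abs(x - door_x) <= width / 2 and 0 <= y - door_y <= depth
```

Either test could be applied to the floor coordinates of persons extracted from the hall camera image; which shape to use is exactly the arbitrary setting the text attributes to unit 16B.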
The user detection processing unit 16C has a function of obtaining, from the monitoring camera image input to the image input processing unit 16A, an image feature amount for detecting a person (user), and of identifying and extracting the user from that feature amount. Alternatively, it may identify and extract a user by comparing a model of a person's head or whole body with the captured image.
In either case, the user detection processing unit 16C identifies and extracts users from the monitoring camera image, and stores the discrimination information of each detected user, for example the individual user's image feature amount. This makes it possible to identify and track a user who moves to another floor. Hereinafter, the image feature amount used to identify a user is referred to as the discrimination information; any other information capable of identifying a user is likewise treated as discrimination information.
The ID setting processing unit 16D has a function of setting and assigning ID information to every user detected by the user detection processing unit 16C. For example, if the ID information is associated with discrimination information on the user's face and body, the user can be tracked through the ID information by image analysis. The discrimination information of each user and the ID information may also be associated by other methods. To ensure confidentiality, it is preferable to encrypt the individual user's discrimination information, ID information, and the like in advance.
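A minimal sketch of this ID assignment follows, assuming discrimination information is a numeric feature vector and that "sufficiently similar" means a simple Euclidean-distance threshold (a real system would use a learned re-identification metric, and the `obscure` helper only stands in for the encryption the text calls for). All names here are hypothetical.

```python
import hashlib
import itertools

class IDRegistry:
    """Assigns an ID number to each new discrimination-information vector
    and returns the existing ID when a sufficiently similar vector
    reappears (e.g. on another floor's camera)."""
    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.entries = []                 # list of (feature_vector, id_number)
        self._counter = itertools.count(1)

    def get_or_assign(self, feature):
        for stored, id_number in self.entries:
            dist = sum((a - b) ** 2 for a, b in zip(stored, feature)) ** 0.5
            if dist <= self.threshold:    # same user seen again
                return id_number
        id_number = f"{next(self._counter):04d}"
        self.entries.append((list(feature), id_number))
        return id_number

    @staticmethod
    def obscure(id_number, salt="hall-log"):
        # Placeholder for the confidentiality measure the text mentions:
        # store only a salted hash, never the raw identifier.
        return hashlib.sha256((salt + id_number).encode()).hexdigest()[:16]
```

Matching a reappearing feature vector to a stored one is what lets the same "0001" follow a user from the boarding floor to the alighting floor.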
Conditions for releasing the association between a user and an ID can also be set; the release condition may be a fixed value stored in the system, or may be changed in response to communication requests from various cooperating systems. Because associating a user with an ID requires storing the user's feature points in a computer, confidentiality deserves particular consideration wherever this system is deployed.
The floor-specific user detection processing unit 16E detects each user's movement information floor by floor, based on the user's discrimination information and ID information. The movement information to be detected includes, for example, the time and floor at which the discrimination information and ID information were generated, the number of the riding car, the user's boarding time and floor, and the user's alighting time and floor.
Such movement information can be acquired by setting and assigning discrimination information and ID information based on image analysis of the user, and the user's movement information can be obtained accurately by tracking it across the monitoring cameras on each floor and the monitoring camera in the car.
The movement information output processing unit 16F has a function of storing the movement information of the user for each floor obtained by the floor-specific user detection processing unit 16E in a separately provided rewritable storage area so that log output is possible.
The log output processing unit 16G has a function of outputting the log of the movement information of the user stored in the movement information output processing unit 16F. Therefore, the user movement information that is output as a log can be effectively used as parameter data for a preliminary simulation for performing group management control.
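As an illustrative sketch only, the per-trip record and log output described for units 16E to 16G could look like the following; the field names are assumptions, not the patent's actual schema, and CSV is chosen arbitrarily as one log format usable as simulation input.

```python
import csv
import io
from dataclasses import dataclass, asdict

@dataclass
class UsageLogEntry:
    # Hypothetical schema: one row per user trip.
    id_number: str
    generation_time: str   # time the ID was assigned in the hall
    car_number: str
    boarding_floor: int
    boarding_time: str
    alighting_floor: int
    alighting_time: str

def write_log(entries):
    """Render stored movement information as CSV text, one row per trip."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=list(UsageLogEntry.__dataclass_fields__))
    writer.writeheader()
    for entry in entries:
        writer.writerow(asdict(entry))
    return buf.getvalue()
```

Rows in this shape directly provide the departure floor, arrival floor, and waiting/boarding times that the background section lists as simulation parameters.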
Next, details of the monitoring camera control system 16, which is a feature of the present embodiment, will be described with reference to figs. 2 to 11. Figs. 2 to 5 explain the concept behind the present embodiment, and figs. 6 to 11 describe a specific implementation.
Fig. 2 shows the elevator hall on a certain floor (e.g., the 4th floor); an elevator hall monitoring camera 17 is installed at an arbitrary position in the hall. The camera 17 is a wide-angle camera capable of photographing the entire hall. Four elevators serve this hall, operating the riding cars 4A to 4D.
Naturally, elevator hall monitoring cameras 17 are also installed on the other floors. The cameras on each floor are configured as network cameras and are comprehensively managed and controlled by the monitoring camera control system 16, which enables users to be tracked. In the hall, users are waiting in front of riding car 4B, which travels upward, and riding car 4D, which travels downward: four upward-bound users Pup wait in front of car 4B, and six downward-bound users Pdn wait in front of car 4D.
Fig. 3 shows the user detection ranges within which image analysis detects the users for each riding car. The user detection ranges 19A to 19D are set as semicircles of a predetermined radius centred on the elevator hall doors 18A to 18D in front of riding cars 4A to 4D; a user detection range is thus set for each of the cars 4A to 4D.
At this time, since users are waiting for cars 4B and 4D, the users present in user detection ranges 19B and 19D can be detected by image analysis. The user detection range can be set arbitrarily by the image input processing unit 16A and the detection range setting processing unit 16B, and may be rectangular or any other shape instead of semicircular.
For all users detected within the user detection range, image feature points as user identification information are extracted by image analysis, and ID numbers are set and assigned so as to be associated with the image feature points of the individual users. Therefore, if there is a user whose image feature points of the user captured by the other elevator hall monitoring cameras 17 match, the movement locus of the user can be estimated from the ID number.
Here, a situation can arise in which it cannot be determined to which user detection range the user Pdn-1 belongs, because that user stands in an overlap area where adjacent user detection ranges 19A to 19D overlap. In this case, image analysis can extract feature points such as the user's face and shoulders, and the user detection range and the corresponding riding car can be identified from the user's posture and the direction the user faces. Alternatively, since the user's movement trajectory can be estimated from the change of the image over time, the detection range and riding car can be identified by determining toward which car the user is moving.
Further, even while a user is simply waiting, the position coordinates of each car's hall door can be set in advance, or set automatically by detecting the hall doors in the image of the elevator hall; the car the user is waiting for can then be determined from which door coordinates the user faces, based on the user's detected posture and direction.
Therefore, in fig. 3, the user Pdn-1 is detected as a user of riding car 4D, belonging to user detection range 19D. The same processing determines the detection range in the overlap regions of the other user detection ranges. Needless to say, an ID number associated with an image feature point as discrimination information can also be set and assigned to a user in an overlap region.
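The trajectory-based criterion above can be sketched as follows, purely as an illustration under assumed 2-D floor coordinates: the user in the overlap area is assigned to the door whose distance shrinks the most along the tracked path. The function and data names are hypothetical.

```python
def assign_car_in_overlap(positions, doors):
    """Resolve which detection range a user in an overlap area belongs to,
    using the movement trajectory: pick the door the user approaches most
    (largest reduction in distance between the first and last tracked
    positions). `positions` is a time-ordered list of (x, y) points;
    `doors` maps a car label to its door-centre coordinates."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    first, last = positions[0], positions[-1]
    return max(doors,
               key=lambda car: dist(first, doors[car]) - dist(last, doors[car]))
```

The posture/orientation criterion mentioned in the text would serve as a fallback when the user is standing still and no trajectory is available.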
In this state, movement information such as the image feature point serving as the individual user's discrimination information, the ID number, the time at which the ID number was assigned, the boarding floor, and the car number can be acquired. Further, when the user disappears from the user detection range, that time can be recorded as the time at which the user boarded the corresponding car.
Fig. 4 shows the movement of users traveling downward in riding car 4B of fig. 3. Elevator hall monitoring cameras 17-4 to 17-1 are installed on the respective floors, and the image information of each camera 17 is transmitted to the monitoring camera control system 16 for comprehensive management and control; the "n" in reference number 17-n denotes the floor number. Similarly, the car 21 of riding car 4B is provided with an in-car monitoring camera 20, whose image information is also transmitted to the monitoring camera control system 16 for comprehensive management and control.
Next, the movement of the users will be described with reference to figs. 5A to 5F, which show the car 21 moving upward, the reverse of fig. 4.
First, as shown in fig. 5A, users waiting on the 5th floor await the arrival of riding car 4B; here there are four users, Pa to Pd. At this point, the discrimination information of each of the users Pa to Pd in user detection range 19B is extracted by analyzing the image from elevator hall monitoring camera 17-5. ID numbers are then set and assigned in association with the discrimination information of users Pa to Pd, and the times at which the ID numbers were set are measured and stored. When riding car 4B arrives, the users Pa to Pd board it, as shown in fig. 5B. When the users Pa to Pd disappear from user detection range 19B, it is determined that they have boarded riding car 4B, and the boarding floor and boarding time are measured and stored.
When the users Pa to Pd board car 4B, their images are transmitted to the monitoring camera control system 16 by the in-car monitoring camera 20 of car 4B; discrimination information for users Pa to Pd is obtained from these images and compared with the discrimination information transmitted from elevator hall monitoring camera 17-5. If the comparison matches, it is recognized that all of the users Pa to Pd in the hall have boarded riding car 4B. If it does not match, the users found to be missing are judged to have moved by stairs or some other route.
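The comparison between the hall-camera set and the in-car-camera set reduces to a set reconciliation. The following is a minimal sketch, assuming users have already been resolved to ID numbers; the three category names are illustrative, not the patent's terminology.

```python
def reconcile_boarding(hall_ids, car_ids):
    """Compare the users detected in the hall detection range with those
    seen by the in-car camera after the doors close. Users in both sets
    boarded; users seen only in the hall presumably left by stairs or
    another route; users seen only in the car boarded on another floor."""
    hall, car = set(hall_ids), set(car_ids)
    return {
        "boarded": sorted(hall & car),
        "left_otherwise": sorted(hall - car),
        "from_other_floor": sorted(car - hall),
    }
```

The same reconciliation run per stop also supports the alighting cross-check described below: a user who vanishes from the car set and appears in a hall set has alighted on that floor.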
Next, when riding car 4B ascends and stops at the 7th floor, the user Pd alights, as shown in fig. 5C. The image of user Pd is transmitted to the monitoring camera control system 16 by elevator hall monitoring camera 17-7, discrimination information for Pd is obtained from it, and this is compared with the discrimination information from the image transmitted by camera 17-5. If they match, user Pd is recognized as having alighted at the 7th floor, and the alighting time and the like are stored.
Further, since user Pd also disappears from the image of the in-car monitoring camera 20 of car 4B, it can be recognized that Pd alighted at the 7th floor. Comparing the discrimination information from elevator hall monitoring camera 17-7 with that from the in-car camera 20 therefore improves the accuracy of user recognition, and performing the same processing on the following floors naturally improves it further.
Next, when riding car 4B ascends and stops at the 8th floor, the users Pa and Pb alight, as shown in fig. 5D. Their images are transmitted to the monitoring camera control system 16 by elevator hall monitoring camera 17-8, discrimination information for Pa and Pb is obtained from them and compared with the discrimination information from camera 17-5. If they match, users Pa and Pb are recognized as having alighted at the 8th floor, and the alighting floor and alighting time are stored. Since Pa and Pb also disappear from the image of the in-car monitoring camera 20 of car 4B, it can likewise be recognized that they alighted at the 8th floor.
Next, when riding car 4B ascends and stops at the 9th floor, the user Pc alights, as shown in fig. 5E. The image of Pc is transmitted to the monitoring camera control system 16 by elevator hall monitoring camera 17-9, discrimination information is obtained from it and compared with the discrimination information from camera 17-5. If they match, user Pc is recognized as having alighted at the 9th floor, and the alighting time and the like are stored. Since Pc also disappears from the image of the in-car monitoring camera 20 of car 4B, it can be recognized that Pc alighted at the 9th floor.
Next, when a new downward-bound user Pe appears in the 9th-floor elevator hall, the user Pe enters user detection range 19C, as shown in fig. 5F; Pe's discrimination information is extracted, ID information is set and assigned, and the time the ID number was set is measured and stored. The movement of user Pe is then tracked to acquire movement information.
Based on this approach, a specific embodiment is described below. Fig. 6 is a flowchart showing the control flow executed by the computer of the monitoring camera control system 16 for the processing described above. The control flow is started at predetermined time intervals.
Step S10
In step S10, the user detection range setting process is executed: user detection ranges 19A to 19D, as shown in fig. 3, are set based on the input image of the elevator hall. When the setting of the user detection ranges is complete, the process proceeds to step S11, where users are extracted for each riding car from the images within ranges 19A to 19D.
Step S11
In step S11, the user detection process analyzes the images within the user detection ranges 19A to 19D set in step S10 and extracts the individual users for each riding car. For this extraction, an image feature amount is obtained from the image captured by the elevator hall monitoring camera 17, and each individual user is identified and extracted from that feature amount. The feature amount is stored in a storage area (not shown) as the individual user's discrimination information.
Here, when analyzing the images in user detection ranges 19A to 19D, if a user is present in an overlap area where detection ranges overlap, it may not be possible to determine to which detection range the user should be assigned. Therefore, when the user detection process ends, the process proceeds to step S12, where the user assignment process for the overlap area is executed.
Step S12
In step S12, a process here called human vector detection is executed to reassign users present in an overlap area to a user detection range. This process determines to which detection range the user Pdn-1 (see fig. 3) in the overlap area belongs: feature points such as the user's face and shoulders are extracted by image analysis, and the detection range and the corresponding riding car are identified from the direction and posture of the user. In addition, since the movement trajectory of user Pdn-1 can be estimated from the change of the image over time, the detection range and riding car can also be identified by determining toward which car the user is moving. In this way, the detection range of a user in an overlap region can be set from at least one piece of information among the user's posture, direction, and movement trajectory. The human vector detection process is described in detail with reference to fig. 7. When it ends, the process proceeds to step S13.
Step S13
In step S13, ID numbers are set and assigned to the discrimination information of all users extracted in steps S11 and S12. If discrimination information containing the feature amounts of the user's face and body is associated with ID information in advance, a moving user's ID information can be tracked by image analysis. In step S13, the ID number associated with each individual user's discrimination information is stored.
For example, as shown in fig. 4, ID number "0001" can be set for user Pa, "0002" for user Pb, "0003" for user Pc, and "0004" for user Pd. Then, when user Pa moves from a departure floor to a different arrival floor, the discrimination information of Pa acquired at the departure floor can be matched against the discrimination information obtained by image analysis at the arrival floor; if they match, user Pa can be tracked through ID number "0001". This ID assignment process is described in detail in fig. 8.
Further, the discrimination information of individual users may be associated with ID numbers by other methods. To ensure confidentiality, it is preferable to encrypt the discrimination information, ID numbers, and the like of individual users in advance. When the ID number assignment process is completed, the process proceeds to step S14.
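As a rough illustration of the confidentiality measure described above (a sketch under our own assumptions; the function name, salting scheme, and feature encoding are ours, not from the patent), the discrimination information can be stored as an irreversible digest so that raw feature values never persist alongside the ID number:

```python
import hashlib

def protect_discrimination_info(feature_values, salt):
    """Hash the raw feature values (hypothetical encoding) so that only
    an irreversible digest is stored in association with the ID number."""
    payload = salt + ",".join(f"{v:.4f}" for v in feature_values)
    return hashlib.sha256(payload.encode()).hexdigest()

# Associate the protected discrimination information with an ID number.
id_table = {}
digest = protect_discrimination_info([0.12, 0.98, 0.33], salt="building-7")
id_table[digest] = "0001"
```

Because the same feature values always yield the same digest, a user re-detected at the arrival floor maps back to the same ID without storing the raw features.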
Step S14
In step S14, the time (generation time) at which each individual user arrives in the user detection range of the elevator hall and is assigned an ID number is stored. Since the generation time differs for each ID number, the distribution of times at which users appear on each floor, for example, can be acquired.
Step S15
In step S15, it is detected whether or not the user has boarded the riding car. This can be detected based on the user disappearing from the image: if the user disappears from the user detection range, it is determined that the user has boarded the car. Since the discrimination information and the ID number of the user have also been acquired, the boarding information of each individual user can be obtained.
Alternatively, the boarding information of the user can be obtained from the image of the in-car monitoring camera 20 provided in the car 21, without making a determination based on the image of the elevator hall monitoring camera 17. Since the discrimination information and the ID number of the user have already been acquired via the elevator hall monitoring camera 17, whether or not a user waiting in the elevator hall has boarded can be determined by comparing that discrimination information with the discrimination information of the user obtained from the image of the in-car monitoring camera 20. When the boarding user detection process is completed, the process proceeds to step S16.
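The disappearance-based boarding check of step S15 can be sketched as a set difference between consecutive hall-camera frames (a minimal sketch; the assumption that image analysis yields a set of ID numbers per frame, and the function name, are ours):

```python
def detect_boarded(prev_frame_users, curr_frame_users):
    """Users present in the previous hall-camera frame but absent from
    the current frame are judged to have boarded the car (step S15).
    Inputs are sets of ID numbers extracted by image analysis."""
    return prev_frame_users - curr_frame_users

# Users "0002" and "0003" are still waiting; "0001" has disappeared.
boarded = detect_boarded({"0001", "0002", "0003"}, {"0002", "0003"})
```

The same primitive, applied in reverse at the arrival floor (users newly appearing in the detection range), would cover the alighting detection of step S17.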
Step S16
In step S16, the time at which the user boarded the car (boarding time) is stored. Since the discrimination information and the ID number of the user have also been acquired, the boarding time can be obtained from the boarding information detected for each user in step S15. When the boarding time is acquired, the process proceeds to step S17.
Step S17
In step S17, when the riding car taken by the user reaches the destination floor, the alighting information of the passenger is detected. The user's alighting can be detected by the elevator hall monitoring camera 17 at the arrival floor. That is, when the user gets off the elevator at the arrival floor, the image of the user is transmitted to the monitoring camera control system 16 by the elevator hall monitoring camera 17 at that floor. The monitoring camera control system 16 obtains the discrimination information of the user who has got off the elevator from the transmitted image, and compares it with the discrimination information of the user in the image transmitted from the elevator hall monitoring camera 17 at the departure floor. If the two match, it is recognized that the user who boarded at the departure floor has got off the elevator, and the ID number, the alighting floor, and the like are stored. Alternatively, since the user also disappears from the image of the in-car monitoring camera 20, it can be inferred that the user has got off the elevator. When the detection process for the user who gets off the elevator is completed, the process proceeds to step S18.
Step S18
In step S18, the time at which the user got off the riding car (alighting time) is stored. Since the discrimination information and the ID number of the user have also been acquired, the alighting time of each user can be obtained from the alighting information detected for each user in step S17. When the alighting time is acquired, the process proceeds to step S19.
Step S19
In step S19, the movement information of the user is stored in a rewritable log storage area so that the movement information (movement history) of each individual user acquired by executing the above-described control steps can be output as a log. In this case, the "discrimination information of the user", the "ID number" associated therewith, the "generation time (set time)" of the ID number, the "generation floor number" at which the ID number was generated, the "boarding time" and "boarding floor number" of the riding car, and the "alighting time" and "alighting floor number" at which the user got off the riding car are stored. In addition, since the information on the user detection range and the car number is also available, it can be stored in the log storage area as needed. When the user's movement information has been stored in the rewritable log storage area, the process proceeds to step S20.
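The stored items listed above can be pictured as one record per user in the log storage area (a minimal sketch; the attribute names are our paraphrase of the quoted items, not identifiers from the patent):

```python
from dataclasses import dataclass, asdict

@dataclass
class MovementLogEntry:
    """One movement-history record per user, following the items
    stored in step S19."""
    discrimination_info: str  # user's discrimination information
    id_number: str            # assigned ID number ("0001", ...)
    generation_time: str      # time the ID number was set
    generation_floor: int     # floor where the ID number was generated
    boarding_time: str
    boarding_floor: int
    alighting_time: str
    alighting_floor: int

# Example record for user Pa (values taken from the fig. 10 example).
entry = MovementLogEntry("feat-Pa", "0001", "8:00:01", 5,
                         "8:01:05", 5, "8:01:20", 8)
log_area = [asdict(entry)]  # rewritable storage area for the log
```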
Step S20
In step S20, the ID numbers of individual users and the discrimination information associated therewith are released (erased), and only the essential movement information of the users is retained as a history. When step S20 ends, the control flow of fig. 6 ends, and the next start timing is awaited. Since the movement information of the user can thus be grasped accurately, the accuracy of the preliminary simulation used for group management control can be improved. The ID release process is described in detail in fig. 9.
Next, a specific example of step S12 shown in fig. 6 will be described with reference to fig. 7. When step S11 shown in fig. 6 is completed, steps S30 to S34 shown in fig. 7 are executed, and the process then moves to step S13.
Step S30
In step S30, it is determined from the camera image whether or not a new user has been detected in the elevator hall. Since each user can be identified by image analysis, if a new user has appeared in the elevator hall compared with the users present up to that point, the new user can be identified by extraction. In step S30, if no new user is extracted, the process ends; if a new user is extracted, the process proceeds to step S31.
Step S31
In step S31, as shown in fig. 3, it is determined whether or not a new user Pdn-1 exists in an overlapping area between the user detection range 19D and the user detection range 19C. Then, if it is determined that the new user Pdn-1 is present in the overlap area, the process proceeds to step S32, and if it is determined that the new user Pdn-1 is not present in the overlap area, the process proceeds to step S33.
Step S32
In step S32, the user detection range to which the new user belongs is determined by the human vector processing shown in fig. 6 (step S12), and the riding car is also determined. For this, the detected position coordinates of the user and the direction in which the user faces are converted numerically into an angle relative to an arbitrary reference direction. For example, in the case of the user Pdn-1 in fig. 3, the reference is set to 0° pointing vertically upward in the drawing when the elevator hall plane is viewed from above. From this, the elevator the user is facing is determined based on the position coordinates of the elevator and the position coordinates of the user.
Simply, for example, the position areas of the elevators may be set so that a user is judged to be waiting for car No. 1 at 1° to 90°, car No. 2 at 91° to 180°, car No. 3 at 181° to 270°, and car No. 4 at 271° to 0°, and the change in the angle toward each waiting car is detected based on the position coordinates of the user. In the example of fig. 3, the direction of the user is detected, and if the user Pdn-1 faces 120°, it is determined that the user is waiting for car No. 2.
In this way, in resetting the user detection range, a waiting direction can be set for each riding car, the waiting direction of a user in the overlap area can be detected from the image of the monitoring camera, and the user detection range of the user in the overlap area can be set by comparing that direction with the waiting direction of each riding car.
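The angle-to-car assignment described above can be sketched as follows (the quadrant ranges come from the text; the function name and the treatment of 0° are our assumptions):

```python
def car_for_waiting_angle(angle_deg):
    """Map the direction a user faces (0° = vertically upward in the
    plan view of the elevator hall, per the example) to the car the
    user is judged to be waiting for, using the ranges in the text."""
    a = angle_deg % 360
    if 1 <= a <= 90:
        return 1
    if 91 <= a <= 180:
        return 2
    if 181 <= a <= 270:
        return 3
    return 4  # 271° to 360°, and 0°

# The user Pdn-1 facing 120° is judged to be waiting for car No. 2.
car = car_for_waiting_angle(120)
```

In practice the facing angle itself would come from the posture estimation of step S12; only the final range lookup is shown here.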
Step S33
In step S33, since it was determined in step S31 that the new user clearly belongs to a single user detection range, the riding car is determined based on that user detection range.
Step S34
In step S34, a car number is assigned to the user in the overlap area determined in step S32. On the other hand, when a new user is present not in the overlap area but within a single user detection range, the car number is assigned to the user extracted in that user detection range. When this processing is completed, the process proceeds to step S13 in fig. 6, and the control flow shown in fig. 6 continues.
Next, a specific example of step S13 shown in fig. 6 will be described with reference to fig. 8. When step S13 shown in fig. 6 begins, steps S40 to S41 shown in fig. 8 are executed, and the process then proceeds to step S14.
Step S40
In step S40, it is determined whether the discrimination information of a user present in the user detection range has been detected for the first time. Since the discrimination information of users present in the user detection range up to the current time has already been stored in the storage area of the monitoring camera control system 16, whether the user is newly detected can be determined by comparing the stored discrimination information with the discrimination information detected this time.
If the discrimination information has been detected before, the process ends; if it is detected for the first time, the process proceeds to step S41.
Step S41
In step S41, a new ID number is assigned to the discrimination information of the user detected for the first time. Since the discrimination information of the user and the ID number are stored in association with each other, the movement of the user to whom the ID number has been assigned can be tracked. When this processing is completed, the process proceeds to step S14 in fig. 6, and the control flow shown in fig. 6 continues.
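Steps S40 and S41 together amount to a first-detection check followed by ID assignment, which can be sketched as below (the class name and the sequential four-digit ID format are assumptions; the patent's fig. 4 only shows IDs "0001" through "0004"):

```python
class IdAssigner:
    """Assign a new ID number only when discrimination information is
    detected for the first time (step S40); otherwise return the ID
    number already stored in association with it (step S41)."""
    def __init__(self):
        self.known = {}   # discrimination information -> ID number
        self.next_id = 1

    def assign(self, discrimination_info):
        if discrimination_info in self.known:  # not the first detection
            return self.known[discrimination_info]
        id_number = f"{self.next_id:04d}"      # first detection: new ID
        self.next_id += 1
        self.known[discrimination_info] = id_number
        return id_number

assigner = IdAssigner()
first = assigner.assign("feat-Pa")  # new user -> "0001"
again = assigner.assign("feat-Pa")  # same user -> same ID number
```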
Next, a specific example of step S20 shown in fig. 6 will be described with reference to fig. 9. When step S19 shown in fig. 6 is completed, steps S50 to S52 shown in fig. 9 are executed, and the process ends.
Step S50
In step S50, it is determined whether release processing of the ID number is required. In the present embodiment, whether or not the movement information (movement history) of the user is tracked over a predetermined period is set, with "1 day" used as an example of that unit. If the release condition is set to "1-day units", the process proceeds to step S51; if not, the process proceeds to step S52.
Step S51
In step S51, the discrimination information (feature values) and the ID number are not released but are held in association with each other until the time reaches 0:00:00. This enables the movement of the user to be tracked over one day.
Step S52
In step S52, if a predetermined release condition is satisfied within a period shorter than "1-day units", the ID number assigned to the individual user and the discrimination information associated therewith are released (canceled). As described above, what is released is the ID number assigned to the individual user and the associated discrimination information; other items such as the "generation time (set time)" of the ID number, the "generation floor number" of the ID number, the "boarding time" and "boarding floor number" of the riding car, and the "alighting time" and "alighting floor number" of the riding car are retained as simulation parameter data.
Here, as release conditions set in step S52, the following can be considered, and an appropriate condition may be set: (1) the user having boarded the car; (2) the expiration of a period, such as one week, used in the traffic calculation for the elevator installation plan; (3) when a transfer between a first elevator group and a second elevator group is assumed, the final arrival of the elevator at the time of transfer from the first elevator group to the second elevator group, or from the second to the first (for example, in the case of transfer from a lower floor to an upper floor, the arrival of the elevator at the lower floor).
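The three release conditions above, plus the "1-day units" case of step S51, can be expressed as a single predicate (a sketch; the condition names, record layout, and parameter names are our assumptions):

```python
def should_release(record, condition, now_hms=None, elevator_arrived=False):
    """Decide whether the ID number and discrimination information in a
    record may be released, for the release conditions discussed."""
    if condition == "on_alighting":       # condition (1)
        return record.get("alighting_time") is not None
    if condition == "daily":              # the 1-day unit of step S51
        return now_hms == "0:00:00"
    if condition == "transfer_complete":  # condition (3)
        return elevator_arrived
    raise ValueError(f"unknown release condition: {condition}")

rec = {"id_number": "0001", "alighting_time": "8:01:20"}
```

Condition (2), period expiry, would be the same shape with a date comparison; it is omitted here to keep the sketch short.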
Fig. 10 is a diagram showing a specific example of the log statistics processing executed in step S19, and shows the log output when the user's alighting is used as the ID release condition.
The log output stores the "user discrimination information" identifying the user, the "ID number" associated therewith, the "generation time" of the ID number, the "generation floor number" at which the ID number was generated, the "boarding time" and "boarding floor number" of the riding car, and the "alighting time" and "alighting floor number" at which the user got off the riding car.
For example, for the user Pa, the ID number "0001" is set at "8:00:01", the user Pa boards a car from the elevator hall on "floor 5" at "8:01:05", and gets off the car at the elevator hall on "floor 8" at "8:01:20". The same applies to the other users, and the accuracy of the preliminary simulation for group management control can be improved based on this movement information.
When the ID release process is executed, the user discrimination information and the associated "ID number" are deleted and replaced with generic user labels (e.g., A, B, C, ...). Since the simulation does not require the personal information of users, there is no problem even if personal information such as the discrimination information is deleted, and this is preferable from the viewpoint of personal information protection.
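The replacement with generic labels can be sketched as a pass over the log records (a minimal sketch under our own record layout; only the personal fields are touched, the times and floors the simulation needs are kept):

```python
from string import ascii_uppercase

def anonymize_log(entries):
    """Drop the user discrimination information, replace ID numbers with
    generic labels (A, B, C, ...), and keep all other log items."""
    out = []
    for i, e in enumerate(entries):
        e = dict(e)                        # leave the original intact
        e.pop("discrimination_info", None) # delete personal information
        e["id_number"] = ascii_uppercase[i % 26]
        out.append(e)
    return out

anon = anonymize_log([
    {"discrimination_info": "feat-Pa", "id_number": "0001",
     "boarding_floor": 5, "alighting_floor": 8},
])
```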
Fig. 11 is a diagram showing a specific example of the log statistics processing executed in step S19, and shows the log output when "1-day units" is used as the ID release condition. The log output is basically the same as in fig. 10, but since the ID release condition is "1-day units", the movement information is acquired in 1-day units, as shown for the user Pc.
For example, in the first detection process, the ID number of the user Pc is set to "0003" at "8:00:04", the user Pc boards a car from the elevator hall on "floor 5" at "8:01:06", and gets off at the elevator hall on "floor 9" at "8:01:55" and goes to the office.
Further, after the user Pc has worked in the office on floor 9 and time has elapsed, the ID number of the user Pc remains "0003"; the user Pc is detected again at "10:10:25", boards a car from the elevator hall on "floor 9" at "10:10:36", and gets off at the elevator hall on "floor 1" at "10:11:03". Similarly, the accuracy of the preliminary simulation for group management control can be improved based on the movement information of the other users. Further, by setting the ID release condition appropriately, data that maintains simulation accuracy can be created for any usage scenario while taking privacy into consideration.
In the present embodiment, the "user discrimination information" identifying the user, the "ID number" associated therewith, the "generation time" of the ID number, the "generation floor number" at which the ID number was generated, the "boarding time" and "boarding floor number" of the riding car, and the "alighting time" and "alighting floor number" at which the user got off the riding car are stored, but other items may be stored as necessary.
The above-described embodiment shows an example in which the images of the elevator hall monitoring cameras 17 provided in the elevator halls of the respective floors and the images of the in-car monitoring cameras 20 provided in the cars are comprehensively managed and controlled by the monitoring camera control system 16; however, the movement information of the user may be acquired using only the elevator hall monitoring cameras 17, or using only the in-car monitoring cameras 20.
When the movement information of the user is acquired using only the hall monitoring cameras 17 provided in the elevator halls of the respective floors, the movement information can basically be acquired by the same control as the control flow shown in fig. 6. In this case, the boarding user detection process of step S15 and the alighting user detection process of step S17 can be performed by detecting the behavior of the user from the images of the elevator hall monitoring cameras 17 at the boarding floor and the alighting floor. That is, boarding can be detected by the user disappearing from the user detection range, and alighting can be detected by the user appearing in the user detection range of the arrival floor.
When the movement information of the user is acquired using only the in-car monitoring camera 20 provided in the car, the discrimination information of each individual user detected by the in-car monitoring camera 20 is associated with an ID number, and the appearance time at which the user appears in the car and the disappearance time at which the user disappears from the car are stored for each ID number. Further, the departure floor and the arrival floor can be associated with each ID number based on the car number information and the time information, and a log output can be obtained.
As described above, the present invention is configured to grasp the movement information of a user by setting and assigning, to each user, discrimination information and an ID number for identifying the user based on the image of an elevator hall camera that photographs the elevator hall or the image of an in-car camera, tracking the movement of the user based on the discrimination information and the ID number, and storing at least the boarding floor and the alighting floor of the user in association with each other.
This makes it possible to accurately grasp the movement information of the user, and thus, for example, accurate parameter data necessary for a preliminary simulation for performing group management control can be obtained, and therefore, the accuracy of the simulation can be improved.
The present invention is not limited to the above-described embodiments and includes various modifications. For example, the above embodiments are described in detail for easy understanding of the present invention and are not limited to configurations having all of the described elements. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of another embodiment may be added to the configuration of one embodiment. Furthermore, for part of the configuration of each embodiment, addition, deletion, or replacement of another configuration is possible.
Description of the reference numerals
10... elevator operation management system, 10a... learning section, 10b... receiving section, 10c... number-of-people-per-floor evaluation section, 10d... comprehensive evaluation section, 10e... allocation instruction section, 11A to 11n... elevator car control systems, 12... communication network, 16... monitoring camera control system, 16a... image input processing section, 16b... detection range setting processing section, 16c... user detection processing section, 16d... ID setting processing section, 16e... user detection processing section, 16f... movement information output processing section, 16g... log output processing section, 17... elevator hall monitoring camera, 18A to 18D... elevator hall doors, 19A to 19D... user detection ranges, 20... in-car monitoring camera.

Claims (4)

1. An elevator use log output system for analyzing images of an elevator hall camera in an elevator hall served by a plurality of riding cars and grasping movement information of users of the elevator hall,
the elevator use log output system is provided with:
a user detection unit for detecting the discrimination information for discriminating the individual user according to the image of the elevator hall camera;
an ID number setting unit for setting and assigning an individual ID number corresponding to the discrimination information;
a movement information detection unit that detects movement information of an individual user based on the discrimination information and the ID number; and
a storage unit that stores the movement information detected by the movement information detection unit in correspondence with the ID number,
the movement information detection unit is provided with user identification means and detects the movement information of the individual user identified by the user identification means, and the user identification means compares the discrimination information of an individual user who boards a riding car from the elevator hall with the discrimination information of an individual user who gets off the elevator at another elevator hall, and specifies the individual user whose discrimination information at the time of boarding and discrimination information at the time of alighting coincide with each other,
the elevator use log output system is provided with a user detection range setting unit which sets, from the image of the elevator hall camera, a user detection range for extracting a user for each riding car of the elevator hall,
the user detection unit extracts the discrimination information of an individual user from the user detection range of each of the riding cars set by the user detection range setting unit,
the elevator use log output system is provided with a user detection range resetting unit which analyzes images of users in an overlapping area and sets the user detection range of the user in the overlapping area when the user is in the overlapping area of adjacent user detection ranges set by the user detection range setting unit,
the user detection range resetting unit sets a waiting direction for each riding car, detects the waiting direction of a user in the overlapping area from the image of the elevator hall camera, compares that direction with the waiting direction of each riding car, and sets the user detection range of the user in the overlapping area.
2. The elevator usage log output system of claim 1,
the elevator use log output system is provided with an ID number release means, and when the movement information is stored in the storage means, the ID number release means releases the ID number of the individual user and the discrimination information associated therewith according to a predetermined release condition.
3. An elevator use log output method for analyzing images of an elevator hall camera in an elevator hall served by a plurality of riding cars and grasping movement information of users of the elevator hall,
in the elevator use log outputting method,
detecting discrimination information for discriminating individual users based on the image of the elevator hall camera,
setting an individual ID number corresponding to the discrimination information,
detecting movement information of the individual user based on the discrimination information and the ID number,
storing the detected movement information corresponding to the ID number,
in the case where the movement information is detected,
comparing the discrimination information of an individual user who boards a riding car from the elevator hall with the discrimination information of an individual user who gets off the elevator at another elevator hall, specifying the individual user whose discrimination information at the time of boarding and discrimination information at the time of alighting coincide with each other, and detecting the movement information,
in the elevator use log outputting method,
setting, from the image of the elevator hall camera, a user detection range for extracting a user for each riding car of the elevator hall,
extracting the discrimination information of an individual user from the user detection range of each of the set riding cars,
analyzing an image of a user in an overlapping area of adjacent user detection ranges and resetting the user detection range of the user in the overlapping area,
in the case of resetting the user detection range,
setting a waiting direction of each riding car, and detecting a waiting direction of a user in the overlapping area based on an image of the elevator hall camera,
and comparing the waiting direction with the waiting direction of each riding car, and setting the user detection range of the user in the overlapping area.
4. The elevator usage log output method according to claim 3,
in the elevator use log outputting method,
if the movement information is stored, the ID numbers of the individual users and the discrimination information associated therewith are released according to a given release condition.
CN201780095835.1A 2017-10-30 2017-10-30 Elevator use log output system and elevator use log output method Active CN111212802B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/039131 WO2019087251A1 (en) 2017-10-30 2017-10-30 Elevator usage log output system, and elevator usage log output method

Publications (2)

Publication Number Publication Date
CN111212802A CN111212802A (en) 2020-05-29
CN111212802B true CN111212802B (en) 2021-06-29

Family

ID=66331617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780095835.1A Active CN111212802B (en) 2017-10-30 2017-10-30 Elevator use log output system and elevator use log output method

Country Status (3)

Country Link
JP (1) JP7005648B2 (en)
CN (1) CN111212802B (en)
WO (1) WO2019087251A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020261560A1 (en) * 2019-06-28 2020-12-30 三菱電機株式会社 Building management system
JP7398913B2 (en) 2019-09-24 2023-12-15 株式会社電通国際情報サービス Mobile movement management system
JP2021066575A (en) * 2019-10-25 2021-04-30 株式会社日立製作所 Elevator system
JP7199341B2 (en) * 2019-12-26 2023-01-05 株式会社日立製作所 BUILDING MODEL DATA SUPPORT SYSTEM AND BUILDING MODEL DATA SUPPORT METHOD
CN111612814A (en) * 2020-02-04 2020-09-01 北京旷视科技有限公司 Method, device and electronic system for identifying and tracking heat-generating personnel

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003276963A (en) * 2003-02-03 2003-10-02 Toshiba Corp Elevator controller by use of image monitoring device
JP2007186300A (en) * 2006-01-13 2007-07-26 Toshiba Elevator Co Ltd Entrance lock interlocking operation elevator control system
CN103221984A (en) * 2010-11-19 2013-07-24 株式会社尼康 Guidance system, detection device, and position assessment device
CN103287939A (en) * 2012-02-24 2013-09-11 东芝电梯株式会社 Apparatus for measuring number of people in elevator, elevator having the apparatus, and elevator system including a plurality of elevators with the apparatus
JP2014219913A (en) * 2013-05-10 2014-11-20 技研トラステム株式会社 Apparatus for counting number of getting on/off passengers
CN104787635A (en) * 2015-04-24 2015-07-22 宁夏电通物联网科技有限公司 Elevator floor data collecting device and elevator floor operation monitoring and controlling system and method
CN105173930A (en) * 2015-06-17 2015-12-23 厦门乃尔电子有限公司 Intelligent elevator system based on mobile terminal and intelligent elevator riding method
WO2016087557A1 (en) * 2014-12-03 2016-06-09 Inventio Ag System and method for alternatively interacting with elevators
CN106553941A (en) * 2015-09-30 2017-04-05 腾讯科技(深圳)有限公司 A kind of intelligent elevator group control method and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09278301A (en) * 1996-04-11 1997-10-28 Mitsubishi Electric Corp Operation control device for elevator
JP3743403B2 (en) 2002-07-15 2006-02-08 株式会社日立製作所 Security device
FI121421B (en) * 2009-07-28 2010-11-15 Marimils Oy A system for controlling lifts in an elevator system
JP5974423B2 (en) * 2010-11-19 2016-08-23 株式会社ニコン Guidance device
CN203794388U (en) * 2013-12-02 2014-08-27 山东省射频识别应用工程技术研究中心有限公司 Elevator operation monitoring and pre-warning system

Patent Citations (9): same as the citations listed above.

Publication number Priority date Publication date Assignee Title

Also Published As

Publication number Publication date
WO2019087251A1 (en) 2019-05-09
JP7005648B2 (en) 2022-01-21
JPWO2019087251A1 (en) 2020-10-22
CN111212802A (en) 2020-05-29

Similar Documents

Publication Publication Date Title
CN111212802B (en) Elevator use log output system and elevator use log output method
CN109292579B (en) Elevator system, image recognition method and operation control method
EP3424856B1 (en) Elevator control apparatus and elevator control method
JP5865729B2 (en) Elevator system
CN101506077B (en) Anonymous passenger indexing system for security tracking in destination entry dispatching operations
EP3041775B1 (en) Elevator dispatch using facial recognition
CN107010500B (en) Elevator group management control device, group management system, and elevator system
CN106915672B (en) Elevator group management control device, group management system, and elevator system
EP2316770A1 (en) Elevator control device
JPWO2006043324A1 (en) Elevator control device
EP2300949A1 (en) Video-based system and method of elevator door detection
JP6230472B2 (en) Group management elevator equipment
CN111225866B (en) Automatic call registration system and automatic call registration method
EP3889090B1 (en) Inferred elevator car assignments based on proximity of potential passengers
CN105270938B (en) Elevator device
CN107000960B (en) Evacuation controller
JP7333773B2 (en) Elevator system and operation control method for elevator device
JP5596423B2 (en) Elevator control system
JP6483559B2 (en) Group management elevator equipment
CN104671020A (en) Elevator system
KR102084497B1 (en) Elevator passenger emergency rescue apparatus
JP7327560B1 (en) elevator group control system
JP2021038028A (en) Detecting system of number of people moving up/down in elevator and detecting method of number of people moving up/down in elevator
RU2447008C2 (en) Method and system of controlling elevators, method of anonymous observation of passengers
WO2021260803A1 (en) Server device, system, method for controlling server device, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant