CN108622778B - Elevator system - Google Patents


Info

Publication number
CN108622778B
CN108622778B · Application CN201711460846.3A
Authority
CN
China
Prior art keywords
hall
corner
edge
pattern information
unit
Prior art date
Legal status
Active
Application number
CN201711460846.3A
Other languages
Chinese (zh)
Other versions
CN108622778A (en)
Inventor
横井谦太朗
野田周平
田村聪
木村纱由美
Current Assignee
Toshiba Elevator and Building Systems Corp
Original Assignee
Toshiba Elevator Co Ltd
Priority date
Filing date
Publication date
Application filed by Toshiba Elevator Co Ltd filed Critical Toshiba Elevator Co Ltd
Publication of CN108622778A publication Critical patent/CN108622778A/en
Application granted granted Critical
Publication of CN108622778B publication Critical patent/CN108622778B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B13/00: Doors, gates, or other apparatus controlling access to, or exit from, cages or lift well landings
    • B66B13/24: Safety devices in passenger lifts, not otherwise provided for, for preventing trapping of passengers
    • B66B13/26: Safety devices in passenger lifts, not otherwise provided for, for preventing trapping of passengers between closing doors
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B5/00: Applications of checking, fault-correcting, or safety devices in elevators
    • B66B5/0006: Monitoring devices or performance analysers
    • B66B5/0012: Devices monitoring the users of the elevator system

Abstract

The present invention relates to an elevator system that can set a detection area suitable for the current stop floor, even when floor information of the elevator is not available, in a building whose halls have different specifications on each floor. An elevator system of an embodiment comprises: an imaging unit that is provided in the car and can image a predetermined range in the direction from the vicinity of the door in the car toward the waiting hall; a storage unit that stores pattern information of a different user detection area for each hall shape; a setting unit that compares the captured image taken by the imaging unit with each piece of stored pattern information, selects the pattern information corresponding to the hall shape shown in the captured image, and sets the user detection area indicated by that pattern information; a detection unit that detects a user in the user detection area set by the setting unit; and a control unit that controls opening and closing of the door based on the detection result of the detection unit.

Description

Elevator system
The present application is based on Japanese Patent Application No. 2017-058764 (filing date: March 24, 2017) and claims priority from that application. That application is incorporated herein by reference in its entirety.
Technical Field
Embodiments of the present invention relate to an elevator system.
Background
In recent years, various techniques have been proposed to prevent people and objects from being caught by elevator car doors. For example, the following techniques are proposed: a camera is used to photograph the vicinity of a car door, and the opening and closing of an elevator door are controlled according to the presence or absence of a user in a detection area set in the vicinity of the car door.
In such techniques, the pattern (shape and size) of the detection area in which users can be detected is fixed in advance for each building and is often shared by all floors. Depending on the building, however, the hall specifications may differ from floor to floor, and the pattern of the detection area may have to be changed for each floor. One way to handle this is to define a detection area for each floor in advance and switch to the detection area corresponding to the elevator's current stop floor each time.
Disclosure of Invention
However, with the above method, when floor information indicating the elevator's current stop floor is not available, the detection area cannot be switched to the one corresponding to the stop floor; in other words, a detection area suitable for the stop floor cannot be set.
The present invention has been made to solve this problem, and an object thereof is to provide an elevator system that can set a detection area suitable for the current stop floor even when floor information of the elevator is not available, in a building in which the hall specifications differ from floor to floor.
An elevator system according to an embodiment includes: an imaging unit which is provided in a car and can image a predetermined range in a direction from the vicinity of a door in the car to a waiting hall; a storage unit that stores pattern information of different user detection areas for each shape of a hall; a setting unit that compares the captured image captured by the imaging unit with each of the stored pattern information, selects pattern information corresponding to the shape of the lobby presented by the captured image, and sets a user detection area indicated by the pattern information; a detection unit that detects a user in the user detection area set by the setting unit; and a control unit that controls opening and closing of the door based on a detection result of the detection unit.
According to the elevator system configured as described above, even in a building in which the hall specifications differ among floors, a detection area suitable for the current stop floor can be set even when floor information of the elevator is not available.
Drawings
Fig. 1 is a diagram showing a schematic configuration example of an elevator system according to an embodiment.
Fig. 2 is a schematic diagram showing an example of the area setting information according to the embodiment.
Fig. 3 is another schematic diagram showing an example of the area setting information according to the embodiment.
Fig. 4 is a diagram for explaining the three-sided frame.
Fig. 5 is a flowchart showing an example of the operation of the image processing apparatus according to this embodiment.
Fig. 6 is a diagram showing an example of the extraction result by the edge/corner extraction unit according to this embodiment.
Fig. 7 is a schematic diagram showing an example of the matching result by the edge/corner matching section according to this embodiment.
Fig. 8 is another diagram showing an example of the matching result by the edge/corner matching section according to this embodiment.
Fig. 9 is still another schematic diagram showing an example of the area setting information according to the embodiment.
Fig. 10 is another diagram showing an example of the extraction result by the edge/corner extraction unit according to this embodiment.
Detailed Description
The following describes embodiments with reference to the drawings.
Fig. 1 is a diagram showing a schematic configuration example of an elevator system according to an embodiment. Note that, although 1 car is described as an example here, the same configuration is also used in the case of a plurality of cars.
A camera 12 (may also be referred to as an "imaging unit") is provided above the entrance/exit of the car 11. Specifically, the lens of the camera 12 faces the waiting hall 15 and is mounted in the baffle 11a covering the upper part of the doorway of the car 11. The camera 12 is a small monitoring camera such as an in-vehicle camera; it has a wide-angle lens and can continuously capture several frames per second (for example, 30 frames/second).
The camera 12 may be always powered on to perform shooting, or may be powered on (activated) at a predetermined timing to start shooting, and powered off at a predetermined timing to end shooting.
Specifically, the camera 12 may be turned on when the moving speed of the car 11 is lower than a predetermined value, and may be turned off when the moving speed of the car 11 is equal to or higher than a predetermined value. In this case, when the car 11 starts decelerating to stop at a predetermined floor and the moving speed is lower than a predetermined value, the camera 12 is turned on and starts shooting. On the other hand, when the car 11 starts accelerating to move to a predetermined floor and the moving speed becomes equal to or higher than a predetermined value, the camera 12 is turned off and the shooting is ended.
That is, imaging by the camera 12 continues from the time the car 11 starts decelerating and its moving speed falls below the predetermined value, through the time the car 11 is stopped at the predetermined floor, until the car 11 starts accelerating toward another floor and its moving speed becomes equal to or higher than the predetermined value.
By switching the power on/off of the camera 12 in this way, a power saving effect can be obtained as compared with a case where the power is always on.
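As a rough illustration, the on/off rule above amounts to a single threshold test. The threshold value and function name here are assumptions for the sketch, not values from the patent:

```python
SPEED_THRESHOLD_M_S = 0.5  # the "predetermined value"; this number is an assumption

def camera_should_be_on(car_speed_m_s):
    """The camera is powered on while the car moves slower than the
    threshold (decelerating to stop, stopped at a floor, or just starting
    to move), and powered off once it is at or above the threshold."""
    return car_speed_m_s < SPEED_THRESHOLD_M_S
```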
The imaging range of the camera 12 is set to L1 + L2 (L1 >> L2). L1 is the imaging range on the hall 15 side, for example 3 m from the car door 13 toward the hall 15. L2 is the imaging range on the car 11 side, for example 50 cm from the car door 13 toward the back of the car. L1 and L2 are ranges in the depth direction; the range in the width direction (the direction orthogonal to the depth direction) is set to be at least larger than the width of the car 11.
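A minimal sketch of the coverage test implied by these ranges, using the example values from the text; the car width and the function interface are assumptions:

```python
L1 = 3.0         # hall-side depth range in metres (example value from the text)
L2 = 0.5         # car-side depth range in metres (example value from the text)
CAR_WIDTH = 1.6  # assumed car width; the text only requires coverage wider than this

def in_coverage(depth_from_door, lateral_offset):
    """depth_from_door: positive toward the hall 15, negative into the car 11;
    lateral_offset: distance from the door centreline."""
    return -L2 <= depth_from_door <= L1 and abs(lateral_offset) <= CAR_WIDTH / 2
```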
In the hall 15 on each floor, a hall door 14 is provided at the arrival entrance of the car 11 so that it can be opened and closed. When the car 11 arrives, the hall doors 14 engage with the car doors 13 and open and close together with them. The power source (door motor) is on the car 11 side, and the hall doors 14 merely follow the car doors 13. In the following description, the hall doors 14 are assumed to be open when the car doors 13 are open, and closed when the car doors 13 are closed.
Each image (video) continuously captured by the camera 12 is analyzed in real time by the image processing device 20. Specifically, the image processing device 20 detects (the movement of) the user closest to the car door 13 based on the change in the brightness value of the image, determines whether the detected user has an intention to get into the car 11, determines whether there is a risk that the detected user's hand or arm is pulled into the door pocket, or the like. The result of the analysis processing performed by the image processing device 20 is reflected in the control processing (mainly, door opening/closing control processing) performed by the elevator control device 30 as necessary.
The elevator control device 30 controls the opening and closing of the doors of the car doors 13 when the car 11 arrives at the waiting hall 15. Specifically, when the car 11 arrives at the waiting hall 15, the elevator control device 30 opens the car doors 13 and closes the doors after a predetermined time has elapsed.
However, when the image processing device 20 detects a user having an intention to ride on the car 11, the elevator control device 30 prohibits the door closing operation of the car doors 13 and maintains the door-opened state (in other words, extends the door-open time). When the image processing apparatus 20 detects a user who is at risk of having his or her hand or arm drawn into the door pocket, the elevator control apparatus 30 prohibits the door opening operation of the car door 13. Alternatively, the elevator control device 30 slows down the door opening speed of the car door 13. Alternatively, the elevator control device 30 warns the user to move away from the car door 13.
Note that, although the image processing device 20 is shown in fig. 1 as being removed from the car 11 for convenience of explanation, the image processing device 20 is actually housed in the baffle 11a together with the camera 12.
The image processing apparatus 20 includes a storage unit 21, a user detection unit 22, an area setting information storage unit 23, an area setting selection unit 24, and the like. The storage unit 21 sequentially stores images (captured images) captured by the camera 12, and has a buffer area for temporarily storing data necessary for processing by the user detection unit 22.
The user detection unit 22 detects (motion of) a user closest to the car door 13 in the user detection region based on a change in the luminance value of the captured image. The user detection unit 22 determines whether the detected user has an intention to get on the car 11, whether the detected user has a risk that the hand or arm of the user is pulled into the door pocket, and the like. These determination results by the user detection unit 22 are reflected in the door opening/closing control of the car door 13 by the elevator control device 30 as necessary.
The area setting information storage unit 23 stores a plurality of pieces of area setting information (also referred to as "pattern information") indicating user detection areas that are set (calibrated) on site in advance as initial settings. The area setting information storage unit 23 stores exactly as many pieces of area setting information as there are pattern types (shapes and sizes) of user detection areas. For example, in a 5-story building in which the pattern of the user detection area differs for each floor, 5 pieces of area setting information are stored in the area setting information storage unit 23.
The area setting information is data in which area information is associated with edge/corner information. The area information indicates the user detection area set on a given floor. The edge/corner information consists of edge information and corner information: the edge information indicates the outline of the user detection area indicated by the associated area information (in other words, the boundary between the user detection area and everything outside it), and the corner information indicates the feature points of that user detection area.
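The association described above can be pictured as a small record type; all field names and types here are illustrative assumptions, not structures defined by the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class AreaSettingInfo:
    """One area-setting entry: area information plus its edge/corner information."""
    label: str                        # which hall shape this pattern belongs to
    area_polygon: List[Point]         # vertices of the user detection area
    edges: List[Tuple[Point, Point]]  # edge lines, including purely virtual segments
    corners: List[Point]              # corner feature points (e.g. Cl, Cr)

# A 5-story building with a different pattern per floor would store 5 entries.
store = [AreaSettingInfo(f"pattern-{i}", [], [], []) for i in range(5)]
```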
For example, for the user detection area set as the initial setting on the 1st floor of a given building, the area setting information associates the area information shown in Fig. 2(a) with the edge/corner information shown in Fig. 2(b). From this information it can be seen that the user detection area E1 shown in Fig. 2(a) is made up of the plurality of edge lines eA1 to eA7 and the plurality of corners cA1 and cA2 shown in Fig. 2(b).
Similarly, for the user detection area set as the initial setting on the 2nd floor of the building, the area setting information associates the area information shown in Fig. 3(a) with the edge/corner information shown in Fig. 3(b). From this information it can be seen that the user detection area E2 shown in Fig. 3(a) is made up of the plurality of edge lines eB1 to eB7 and the plurality of corners cB1 and cB2 shown in Fig. 3(b).
As shown in fig. 2 and 3, the shape of the user detection area is determined based on the specification (shape) of the hall of each floor, more specifically, the shape of the three-sided frame of each floor.
Here, the three-sided frame will be described with reference to Fig. 4.
The three-sided frame is a frame arranged around the hall door 14 and is composed of 3 frames: a left frame 16A on the left side as viewed from the car 11 side (the right side as viewed from the hall 15 side), an upper frame 16B above the hall 15, and a right frame 16C on the right side as viewed from the car 11 side (the left side as viewed from the hall 15 side). The left frame 16A and the right frame 16C contact the wall surface W of the building.
The corner of the left frame 16A on the wall W side contacting the ground is referred to as a left corner Cl, and the corner of the right frame 16C on the wall W side contacting the ground is referred to as a right corner Cr.
Among the plurality of sides constituting the inner surface of the left frame 16A, the side on the hall 15 side extending in the vertical direction from the floor of the hall 15 is referred to as the left side HlA, and the side contacting the floor of the hall 15 is referred to as the left side HlB.
Similarly, among the plurality of sides constituting the inner surface of the right frame 16C, the side on the hall 15 side extending in the vertical direction from the floor of the hall 15 is referred to as the right side HrA, and the side contacting the floor of the hall 15 is referred to as the right side HrB.
That is, the edge line eA2 shown in Fig. 2 corresponds to the left side HlA of the three-sided frame shown in Fig. 4. Likewise, the edge line eA3 corresponds to the left side HlB, the edge line eA5 corresponds to the right side HrB, and the edge line eA6 corresponds to the right side HrA. In addition, the corner cA1 corresponds to the left corner Cl, and the corner cA2 corresponds to the right corner Cr.
The edge lines eA1 and eA7 are line segments used to define the user detection area and do not actually exist in the hall 15. The edge line eA4 corresponds to the side, among the plurality of sides constituting the sill of the hall 15, that extends horizontally along the door opening/closing direction of the hall door 14 (car door 13).
Since the correspondence between fig. 3 and 4 is the same as the correspondence between fig. 2 and 4, detailed description thereof will be omitted here.
The description returns to Fig. 1. The area setting selection unit 24 sets a user detection area suitable for the floor where the car 11 is currently stopped. The area setting selection unit 24 includes an edge/corner extraction unit 24a, an edge/corner matching unit 24b, and the like.
The edge/corner extraction unit 24a extracts edges and corners from the image first captured after the car 11 levels at a given floor and the car door 13 fully opens (hereinafter, the "fully-open initial image"). More specifically, the edge/corner extraction unit 24a applies known image processing, such as Sobel edge detection and SIFT feature-point detection, to the fully-open initial image to extract edges and corners. The extraction result indicating the extracted edges and corners is sent to the edge/corner matching unit 24b.
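As an illustration of the kind of edge extraction the patent cites, here is a minimal Sobel edge detector written with NumPy only; the threshold and implementation details are assumptions for this sketch, not the patent's method:

```python
import numpy as np

def sobel_edges(img, thresh=1.0):
    """Return a boolean edge map: True where the Sobel gradient
    magnitude exceeds the threshold (image borders are left False)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    mag = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            mag[i, j] = np.hypot((patch * kx).sum(), (patch * ky).sum())
    return mag > thresh
```

A vertical brightness step (such as the boundary between a frame and the wall) produces a column of True values along the step.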
The edge/corner matching unit 24b matches the edges and corners extracted from the fully-open initial image (indicated by the extraction result of the edge/corner extraction unit 24a) against the edges and corners indicated by the edge/corner information included in each piece of area setting information stored in the area setting information storage unit 23. As the result of this matching, the edge/corner matching unit 24b calculates a matching rate between the edges and corners extracted from the fully-open initial image and those indicated by the edge/corner information.
This processing by the edge/corner matching unit 24b is repeated in turn for all the area setting information stored in the area setting information storage unit 23.
The edge/corner matching unit 24b then selects the area setting information corresponding to the highest of the calculated matching rates and sets the user detection area indicated by the area information included in that area setting information as the user detection area of the floor where the car 11 is currently stopped.
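The matching and selection steps can be sketched as follows. Representing edges and corners as hashable labels is a simplification for illustration, since the actual matching is geometric:

```python
def match_rate(extracted, stored):
    """Fraction of the stored edges/corners also found among the
    features extracted from the fully-open initial image."""
    if not stored:
        return 0.0
    return sum(1 for f in stored if f in extracted) / len(stored)

def select_pattern(extracted, patterns):
    """patterns: list of (name, stored_features) pairs; pick the
    pattern with the highest matching rate."""
    return max(patterns, key=lambda p: match_rate(extracted, p[1]))[0]
```

With the figures' example, a pattern matching 7 of 9 stored features beats one matching only 1 of 9.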
Next, an example of the operation until the user detection area is set by the image processing apparatus 20 configured as described above will be described with reference to the flowchart of fig. 5. Here, it is assumed that the area setting information shown in fig. 2 and 3 is stored in advance in the area setting information storage unit 23 in the image processing apparatus 20.
First, when state information indicating that the car 11 is on the flat floor at a predetermined floor and the car door 13 is in the fully open state is received from the elevator control device 30 (step S1), the area setting selection unit 24 in the image processing device 20 acquires a fully open initial image captured first after the car door 13 is in the fully open state from the camera 12 (step S2).
Next, the edge/corner extraction unit 24a in the area setting selection unit 24 extracts edges and corners from the fully-open initial image (step S3). Here, assume that the edges and corners shown in Fig. 6 are extracted from the fully-open initial image.
When the edge and the corner are extracted from the initial image at the full-open time, the edge/corner matching unit 24b in the area setting selection unit 24 matches the extracted edge and corner with the edge and corner indicated by the edge/corner information included in the area setting information stored in the area setting information storage unit 23 (step S4). Then, the edge/corner matching unit 24b calculates the matching rates of the edges and corners extracted from the initial image at the full open time and the edges and corners indicated by the edge/corner information (step S5).
The borderline/corner matching unit 24b determines whether or not the matching rate is calculated for all the area setting information stored in the area setting information storage unit 23. In other words, the borderline/corner matching unit 24b determines whether or not the processing of steps S4 and S5 is executed for all the area setting information stored in the area setting information storage unit 23 (step S6).
As a result of the processing at step S6, if it is determined that the matching rate has not been calculated for all the area setting information (no at step S6), the borderline/corner matching unit 24b executes the processing at steps S4 and S5 for the area setting information for which the matching rate has not been calculated. On the other hand, if it is determined as a result of the processing at step S6 that the matching rates have been calculated for all the area setting information (yes at step S6), the process proceeds to step S7, which will be described later.
Here, the processing in steps S4 and S5 will be specifically described.
Here, as described above, it is assumed that the area setting information shown in Figs. 2 and 3 is stored in the area setting information storage unit 23 in advance. Therefore, the edges and corners shown in Fig. 6 are first matched against the edges and corners shown in Fig. 2(b). In this case, as shown by the solid lines in Fig. 7, the edge lines eA2 to eA6 and the corners cA1 and cA2 shown in Fig. 2(b) coincide (match) with the edges and corners shown in Fig. 6.
That is, of the 7 edge lines eA1 to eA7 and the 2 corners cA1 and cA2 indicated by the edge/corner information shown in Fig. 2(b), 5 edge lines eA2 to eA6 and 2 corners cA1 and cA2 correspond to the edges and corners shown in Fig. 6; the matching rate is therefore calculated as 7/9 (about 78%).
Similarly, the edges and corners shown in Fig. 6 are matched against the edges and corners shown in Fig. 3(b). In this case, as shown by the solid line in Fig. 8, only the edge line eB4 shown in Fig. 3(b) coincides (matches) with the edges and corners shown in Fig. 6.
That is, of the 7 edge lines eB1 to eB7 and the 2 corners cB1 and cB2 indicated by the edge/corner information shown in Fig. 3(b), only 1 edge line eB4 corresponds to the edges and corners shown in Fig. 6; the matching rate is therefore calculated as 1/9 (about 11%).
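The two rates in this example work out as follows (each pattern stores 9 features: 7 edge lines plus 2 corners):

```python
# Fig. 2(b) pattern: 5 edge lines + 2 corners matched out of 9 features.
rate_fig2 = 7 / 9   # about 78%
# Fig. 3(b) pattern: only edge line eB4 matched out of 9 features.
rate_fig3 = 1 / 9   # about 11%
best = "fig2" if rate_fig2 > rate_fig3 else "fig3"
```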
The explanation returns to Fig. 5. When the matching rates have been calculated for all the area setting information, the edge/corner matching unit 24b selects the area setting information corresponding to the highest of the calculated matching rates (step S7). Here, as described above, the matching rate with the area setting information shown in Fig. 2 is about 78% and that with the area setting information shown in Fig. 3 is about 11%, so the area setting information shown in Fig. 2 is selected.
Then, the zone setting selection unit 24 sets the user detection zone indicated by the zone information included in the selected zone setting information as the user detection zone of the floor where the car 11 is currently stopped (step S8), and ends the operation here.
That is, since the area setting information shown in fig. 2 is selected here, the user detection area E1 is set to be an appropriate user detection area.
As described above, once the user detection area for detecting users has been set properly, the user detection unit 22 in the image processing apparatus 20 starts the user detection processing.
In the present embodiment, the edge/corner matching unit 24b calculates the matching rate of the edge and the corner, but the present invention is not limited thereto, and the edge/corner matching unit 24b may calculate only the matching rate of the edge or only the matching rate of the corner.
In the present embodiment, the edge/corner information included in the area setting information is stored in the area setting information storage unit 23 as shown in Figs. 2(b) and 3(b), but it is not limited to this and may instead be stored as shown in Fig. 9. That is, as shown in Fig. 9, the edge/corner information may be stored with the line segments that merely define the user detection area and do not actually exist in the hall 15 removed.
This reduces the possibility that a line segment indicated by the edge/corner information that does not actually exist in the hall 15 falsely matches an unnecessary edge extracted by the edge/corner extraction unit 24a, resulting in an inappropriate user detection area being set.
Further, in the present embodiment the description has assumed that, as shown in Fig. 6, no person appears in the fully-open initial image. In reality, however, a person may appear in the fully-open initial image as shown in Fig. 10(a), and extra edges corresponding to that person may be extracted by the edge/corner extraction unit 24a as shown in Fig. 10(b).
Even in such a case, since this mostly adds only edges and corners that do not match those indicated by the edge/corner information, the accuracy of setting the user detection area does not drop significantly. Moreover, by treating only the stationary edges and corners among those extracted as valid, the extra edges corresponding to the moving person shown in Fig. 10(b) can be excluded.
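One hedged way to realize the "stationary edges only" idea is to intersect boolean edge maps taken from several frames, so that edges belonging to a moving person drop out. This specific scheme is an assumption, not something the patent spells out:

```python
import numpy as np

def stationary_edges(edge_maps):
    """Logical AND across boolean edge maps from successive frames:
    an edge pixel survives only if it is present in every frame."""
    result = edge_maps[0].copy()
    for m in edge_maps[1:]:
        result &= m
    return result
```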
In the present embodiment, the area setting information storage unit 23 is assumed to store area setting information corresponding to the hall specifications of the building in which the system is installed, but it may additionally store area setting information corresponding to a standard hall specification (also referred to as "standard pattern information"), independent of the building's hall specifications.
Accordingly, even if no edges or corners are extracted by the edge/corner extraction unit 24a because of, for example, the imaging environment when the fully-open initial image was captured (for example, the hall being too dark), the area setting selection unit 24 can set the user detection area indicated by the area setting information corresponding to the standard hall specification. That is, even if no area setting information is selected by the edge/corner matching unit 24b, the situation in which no user detection area is set at all can be avoided.
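This fallback behaviour can be sketched as a simple guard; the names here are illustrative:

```python
STANDARD_PATTERN = "standard-hall"  # stands in for the standard pattern information

def choose_pattern(extracted_features, best_matching_pattern):
    """If nothing was extracted (e.g. the image was too dark), matching
    cannot select a pattern, so fall back to the standard hall pattern."""
    if not extracted_features:
        return STANDARD_PATTERN
    return best_matching_pattern
```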
In addition, instead of storing area setting information corresponding to the standard hall specification, the piece of area setting information, among those corresponding to the building's hall specifications, that is closest to the standard hall specification may be used in its place. Alternatively, the piece of area setting information indicating the smallest user detection area may be used in its place.
According to the embodiment described above, the elevator system includes the image processing device 20 having the area setting information storage unit 23 and the area setting selection unit 24: the area setting information storage unit 23 stores, at the time of initial setting, area setting information of a different user detection area for each hall specification (shape), and the area setting selection unit 24 extracts edges and corners from the fully-open initial image, selects the area setting information corresponding to the extracted edges and corners, and sets the user detection area.
Thus, it is possible to provide an elevator system capable of setting a detection area suitable for a current stop floor even when floor information of an elevator is not obtained for a building having a hall with a different specification in each floor. Accordingly, it is possible to prevent the user from missing detection or excessive detection due to setting of an inappropriate detection region.
In addition, as described above, since a detection area suitable for the current stop floor can be set even when floor information of the elevator is not available in a building whose hall specifications differ from floor to floor, there is also the advantage that no additional wiring is needed for exchanging floor information between the image processing device 20 and the elevator control device 30.
Several embodiments of the present invention have been described, but these embodiments are provided as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in other various forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are included in the invention described in the claims and the equivalent scope thereof.

Claims (8)

1. An elevator system is characterized by comprising:
an imaging unit that is installed in a car and captures a predetermined range extending from the vicinity of a door of the car toward a hall;
a storage unit that stores a plurality of pieces of pattern information, each indicating a user detection area corresponding to a shape of a hall;
a setting unit that compares the image captured by the imaging unit with each piece of the stored pattern information, selects the pattern information corresponding to the hall shape shown in the captured image, and sets the user detection area indicated by that pattern information;
a detection unit that detects a user in the user detection area set by the setting unit; and
a control unit that controls opening and closing of the door based on a detection result of the detection unit.
2. Elevator system according to claim 1,
wherein the setting unit compares the first image captured when the door is fully opened with the stored pattern information.
3. Elevator system according to claim 1,
wherein the storage unit stores a plurality of pieces of pattern information, each indicating a user detection area corresponding to a shape of a three-sided frame provided in the hall.
4. Elevator system according to claim 3,
wherein the three-sided frame is composed of three frame members, and the storage unit stores a 1st edge, a 2nd edge, and a 3rd edge as the pattern information, the 1st edge corresponding to a hall-side edge extending vertically with respect to the hall floor among the edges constituting each of the left and right frame members, the 2nd edge corresponding to an edge in contact with the hall floor among the edges constituting each of the left and right frame members, and the 3rd edge corresponding to a hall-side edge extending horizontally along the door opening and closing direction among the edges constituting a hall-side sill.
5. Elevator system according to claim 4,
wherein the setting unit extracts edges from the captured image, matches the extracted edges against the 1st to 3rd edges indicated by each piece of pattern information, calculates a matching ratio between the extracted edges and the 1st to 3rd edges for each piece of pattern information, selects the pattern information having the highest of the calculated matching ratios, and sets the user detection area.
6. Elevator system according to claim 3,
wherein the storage unit stores, as the pattern information, corners on the building wall surface side at which the left and right frame members, among the three frame members constituting the three-sided frame, contact the hall floor.
7. Elevator system according to claim 6,
wherein the setting unit extracts corners from the captured image, matches the extracted corners against the corners indicated by each piece of pattern information, calculates a matching ratio between the extracted corners and the indicated corners for each piece of pattern information, selects the pattern information having the highest of the calculated matching ratios, and sets the user detection area.
8. Elevator system according to claim 1,
wherein the storage unit further stores, separately from the plurality of pieces of pattern information corresponding to hall shapes, standard pattern information of a user detection area corresponding to a standard hall shape, and
the setting unit selects the stored standard pattern information and sets the user detection area indicated by the standard pattern information when no pattern information corresponding to the hall shape shown in the captured image is obtained as a result of the comparison.
CN201711460846.3A 2017-03-24 2017-12-28 Elevator system Active CN108622778B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-058764 2017-03-24
JP2017058764A JP6317004B1 (en) 2017-03-24 2017-03-24 Elevator system

Publications (2)

Publication Number Publication Date
CN108622778A CN108622778A (en) 2018-10-09
CN108622778B true CN108622778B (en) 2020-05-01

Family

ID=62069427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711460846.3A Active CN108622778B (en) 2017-03-24 2017-12-28 Elevator system

Country Status (4)

Country Link
JP (1) JP6317004B1 (en)
CN (1) CN108622778B (en)
MY (1) MY193315A (en)
SG (1) SG10201800809UA (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10837215B2 (en) * 2018-05-21 2020-11-17 Otis Elevator Company Zone object detection system for elevator system
JP6896808B2 (en) * 2019-08-09 2021-06-30 東芝エレベータ株式会社 Elevator user detection system
JP6881853B2 (en) * 2019-08-09 2021-06-02 東芝エレベータ株式会社 Elevator user detection system
JP6849760B2 (en) * 2019-08-26 2021-03-31 東芝エレベータ株式会社 Elevator user detection system
JP6843935B2 (en) * 2019-09-05 2021-03-17 東芝エレベータ株式会社 Elevator user detection system
JP6833942B1 (en) * 2019-09-10 2021-02-24 東芝エレベータ株式会社 Elevator user detection system
JP6828112B1 (en) * 2019-09-18 2021-02-10 東芝エレベータ株式会社 Elevator user detection system
JP7019740B2 (en) * 2020-03-23 2022-02-15 東芝エレベータ株式会社 Elevator user detection system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11261994A (en) * 1998-03-11 1999-09-24 Mitsubishi Electric Corp Object detector and user number detector for elevator
JP2005179030A (en) * 2003-12-22 2005-07-07 Mitsubishi Electric Corp Elevator control device
JP2007131382A (en) * 2005-11-09 2007-05-31 Hitachi Building Systems Co Ltd Cage inside monitoring device of elevator, and monitoring program
CN103663068A (en) * 2012-08-30 2014-03-26 株式会社日立制作所 Elevator door system and elevator having elevator door system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002197588A (en) * 2000-12-26 2002-07-12 Fujitsu Ltd Method for discriminating tire type of traveling vehicle and method and device for discriminating vehicle model
WO2009031376A1 (en) * 2007-09-04 2009-03-12 Mitsubishi Electric Corporation Method and device for detecting users, and control method
JP6046286B1 (en) * 2016-01-13 2016-12-14 東芝エレベータ株式会社 Image processing device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11261994A (en) * 1998-03-11 1999-09-24 Mitsubishi Electric Corp Object detector and user number detector for elevator
JP2005179030A (en) * 2003-12-22 2005-07-07 Mitsubishi Electric Corp Elevator control device
JP2007131382A (en) * 2005-11-09 2007-05-31 Hitachi Building Systems Co Ltd Cage inside monitoring device of elevator, and monitoring program
CN103663068A (en) * 2012-08-30 2014-03-26 株式会社日立制作所 Elevator door system and elevator having elevator door system

Also Published As

Publication number Publication date
JP2018162116A (en) 2018-10-18
CN108622778A (en) 2018-10-09
MY193315A (en) 2022-10-04
JP6317004B1 (en) 2018-04-25
SG10201800809UA (en) 2018-10-30

Similar Documents

Publication Publication Date Title
CN108622778B (en) Elevator system
CN108622777B (en) Elevator riding detection system
US10196241B2 (en) Elevator system
CN108622776B (en) Elevator riding detection system
JP6068694B1 (en) Elevator boarding detection system
JP6139729B1 (en) Image processing device
JP2008120548A (en) Control device of elevator
JP7230114B2 (en) Elevator user detection system
CN110294391B (en) User detection system
JP2002293484A (en) Elevator control device
CN110294371B (en) User detection system
JP2020152469A (en) Elevator user detection system
CN113023518B (en) Elevator user detection system
JP6270948B1 (en) Elevator user detection system
CN111717768B (en) Image processing apparatus and method
CN112441490B (en) User detection system for elevator
CN115703609A (en) Elevator user detection system
CN112340560B (en) User detection system for elevator
JP6828108B1 (en) Elevator user detection system
CN115108425B (en) Elevator user detection system
CN111717742B (en) Image processing apparatus and method
CN113911868B (en) Elevator user detection system
JP7305849B1 (en) elevator system
JP7358606B1 (en) elevator system
CN111453588B (en) Elevator system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1259468

Country of ref document: HK

GR01 Patent grant