CN117246862A - Elevator system - Google Patents

Elevator system

Info

Publication number
CN117246862A
CN117246862A
Authority
CN
China
Prior art keywords: car, person, image, unit, user
Prior art date
Legal status (assumed; not a legal conclusion)
Pending
Application number
CN202310709737.XA
Other languages
Chinese (zh)
Inventor
白仓邦彦
Current Assignee (listed assignees may be inaccurate)
Toshiba Elevator and Building Systems Corp
Original Assignee
Toshiba Elevator Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Toshiba Elevator Co Ltd filed Critical Toshiba Elevator Co Ltd
Publication of CN117246862A


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B 5/00: Applications of checking, fault-correcting, or safety devices in elevators
    • B66B 5/0006: Monitoring devices or performance analysers
    • B66B 5/0012: Devices monitoring the users of the elevator system
    • B66B 1/00: Control systems of elevators in general
    • B66B 1/02: Control systems without regulation, i.e. without retroactive action
    • B66B 1/06: Control systems without regulation, i.e. without retroactive action, electric
    • B66B 1/34: Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B 1/3476: Load weighing or car passenger counting devices
    • B66B 11/00: Main component parts of lifts in, or associated with, buildings or other structures
    • B66B 11/02: Cages, i.e. cars

Abstract

Embodiments of the present invention relate to an elevator system that detects users with a camera installed in the elevator car. The system accurately detects users riding in the car without being affected by images on installed objects such as mirrors or posters in the car, and thereby prevents miscounting of the number of passengers. An elevator system according to one embodiment includes an imaging unit that captures, from within the car, an image of a range that includes the hall near the entrance, a person detection unit, a tracking unit, and a detection processing unit. The person detection unit detects persons in the images obtained by the imaging unit. The tracking unit tracks each person detected by the person detection unit. The detection processing unit detects a user riding in the car based on the tracking distance and position of the person obtained by the tracking unit.

Description

Elevator system
The present application is based on Japanese patent application 2022-097402 (filed June 16, 2022) and claims priority from that application, which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of the present invention relate to an elevator system that detects a user using a camera provided in an elevator car.
Background
Conventionally, systems are known in which a camera is provided in the elevator car, the number of users riding in the car is determined by processing the images captured by the camera, and the detection result is reflected in the operation control of the elevator.
Such a system requires that users be detected accurately by image processing. However, if a mirror is provided in the car, a user reflected in the mirror may be erroneously detected as an additional person. Likewise, when a poster or the like depicting a person is attached inside the car, the person on the poster may be falsely detected as a user.
If an image on an installed object such as a mirror or poster in the car is erroneously detected as a user, the passenger count becomes higher than the actual number; the resulting erroneous full-load judgments, for example, lower the running efficiency.
Disclosure of Invention
The invention provides an elevator system that accurately detects users riding in the car without being affected by images on installed objects such as mirrors and posters in the car, and thereby prevents miscounting of the number of passengers.
An elevator system according to one embodiment includes an imaging unit that captures, from within the car, an image of a range that includes the hall near the entrance, a person detection unit, a tracking unit, and a detection processing unit. The person detection unit detects persons in the images obtained by the imaging unit. The tracking unit tracks each person detected by the person detection unit. The detection processing unit detects a user riding in the car based on the tracking distance and position of the person obtained by the tracking unit.
According to the elevator system configured as described above, users riding in the car can be detected accurately without being affected by images on installed objects such as mirrors or posters in the car, and miscounting of the number of passengers can be prevented.
Drawings
Fig. 1 is a diagram showing a configuration of an elevator system according to an embodiment.
Fig. 2 is a view showing a configuration of an entrance peripheral portion in the car according to the embodiment.
Fig. 3 is a view showing an example of an image captured by the camera according to this embodiment.
Fig. 4 is a diagram showing an example of an image of the frame N in this embodiment.
Fig. 5 is a diagram showing an example of an image of the frame N+1 in this embodiment.
Fig. 6 is a view showing an example of a captured image in this embodiment, and shows a state in which a user returns to a hall after riding in a car.
Fig. 7 is a view showing an example of a captured image in the embodiment, and shows a state in which a poster depicting a person is attached in the car.
Fig. 8 is a flowchart for explaining the operation of the elevator system in this embodiment.
Fig. 9 is a flowchart showing details of the user detection process performed in step S16 of fig. 8.
Fig. 10 is a flowchart for explaining switching to full-load operation as an application example.
Fig. 11 is a flowchart for explaining the switching to the full-load operation in consideration of the load value as an application example.
Fig. 12 is a flowchart for explaining the failure determination as an application example.
Detailed Description
Hereinafter, embodiments will be described with reference to the drawings.
This disclosure is merely an example, and the invention is not limited to the following embodiments. Variations that would readily occur to those skilled in the art are naturally included within the scope of the disclosure. For clarity, the drawings may show the dimensions, shapes, and the like of each portion schematically, altered from the actual implementation. Corresponding elements are denoted by the same reference numerals throughout the drawings, and their detailed description may be omitted.
Fig. 1 is a diagram showing the configuration of an elevator system according to an embodiment. A single car is taken as an example here, but systems with a plurality of cars are configured similarly.
A camera 12 serving as the imaging unit is installed above the entrance of the car 11. Specifically, the camera 12 is mounted in a door lintel plate 11a covering the upper part of the entrance of the car 11, with its lens portion facing directly downward. The camera 12 has an ultra-wide-angle lens such as a fisheye lens and photographs subjects over a wide range, including the interior of the car 11, at a field angle of 180 degrees or more. The camera 12 continuously captures many frames per second (e.g., 30 frames/second).
The camera 12 need not be above the entrance of the car 11, as long as it is near the car door 13. For example, it may be installed anywhere that can image the entire car interior, including the whole floor surface in the car 11 and the hall 15 near the entrance when the door is open, such as on the ceiling surface near the entrance of the car 11.
A load detector 10 for detecting the load value in the car 11 is provided at the bottom of the car 11. The detected load value is transmitted to the elevator control device 30 via a transmission cable (not shown).
A hall door 14 is installed at the entrance of the car 11 in the hall 15 of each floor so as to be openable and closable. The hall door 14 engages with the car door 13 and performs opening and closing operations when the car 11 arrives. The power source (door motor) is on the car 11 side, and the hall door 14 merely opens and closes following the car door 13. In the following description, the hall door 14 is assumed to open when the car door 13 opens and to close when the car door 13 closes.
Each image (video frame) continuously captured by the camera 12 is analyzed and processed in real time by the image processing device 20. In fig. 1, the image processing device 20 is drawn outside the car 11 for convenience, but in practice it is housed in the door lintel plate 11a together with the camera 12.
The image processing device 20 has a storage unit 21 and a detection unit 22. The storage unit 21 has a buffer area for sequentially storing the images captured by the camera 12 and for temporarily holding data needed for the processing by the detection unit 22. The storage unit 21 may also store images that have undergone preprocessing such as distortion correction, enlargement/reduction, and partial cropping.
The detection unit 22 is constituted by, for example, a microprocessor, and detects users located in the car 11 or in the hall 15 using the images captured by the camera 12. The detection unit 22 is functionally divided into a person detection unit 22a, a tracking unit 22b, and a detection processing unit 22c. These may be realized by software, by hardware such as an IC (Integrated Circuit), or by a combination of both. Part or all of the functions of the image processing device 20 may instead be provided in the elevator control device 30.
The person detection unit 22a detects persons in the images obtained by the camera 12. The tracking unit 22b tracks each person detected by the person detection unit 22a. For person detection, for example, CoHOG (Co-occurrence Histograms of Oriented Gradients) features are used. For person tracking, a method of tracking a person identified by CoHOG features between frame images is used.
The detection processing unit 22c detects a user riding in the car 11 based on the tracking distance and position of the person obtained by the tracking unit 22b. The "tracking distance" is the distance accumulated on the captured images from the start of tracking of the person up to the current point in time. The "position" is the X-Y coordinate position of the person on the captured image, and includes at least the Y-direction coordinate (the direction orthogonal to the door opening/closing direction). The detection processing unit 22c also has a function of counting the number of passengers when a user riding in the car 11 is detected.
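As an illustration (not part of the patent text), the accumulated tracking distance can be sketched as follows; the function name and the use of 2-D centroid coordinates per frame are assumptions of this sketch:

```python
import math

def tracking_distance(positions):
    """Cumulative path length of a tracked person across frames.

    positions: list of (x, y) image coordinates, one entry per frame,
    starting from the frame where tracking began.
    Returns the distance accumulated up to the last frame, regardless
    of the direction of movement; a mirror image that stays within the
    mirror area accumulates very little distance.
    """
    return sum(math.dist(p, q) for p, q in zip(positions, positions[1:]))
```

With this definition, a person who walks 3 pixels right and 4 pixels down has a tracking distance of 5, while a stationary poster image stays at 0.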
Since the "tracking distance" is the distance the person has moved from the start of tracking to the current point in time, it is independent of the direction of movement. The detection processing unit 22c determines that a person is a user riding in the car 11 when the person's tracking distance and position satisfy the following conditions:
• The tracking distance of the person is equal to or greater than a predetermined distance Th
• The position of the person is below (on the car side of) the reference position Py
The reference position Py is set to the Y-direction coordinate of the edge of the car threshold 47 on the car floor surface 19 side (see fig. 4).
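The two conditions above can be sketched as a single check (an illustrative sketch, not the patent's implementation; it assumes pixel coordinates in which the hall is at the top of the image and Y increases downward, so a position "below Py", i.e. inside the car, has a larger Y value):

```python
def is_boarding_user(track_dist, y, th, py):
    """Boarding test from the embodiment:
    - track_dist >= th filters out mirror images, posters, and monitor
      images, whose accumulated tracking distance stays small;
    - y > py means the person is on the car-floor side of the car
      threshold 47, i.e. inside the car rather than in the hall
      (assuming Y grows downward toward the car interior).
    """
    return track_dist >= th and y > py
```

A person who has moved far enough but is still on the hall side of Py, such as user P2 in fig. 6, is not counted.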
The elevator control device 30 is constituted by a computer provided with a CPU, ROM, RAM, and the like. It includes an operation control unit 31, a door opening/closing control unit 32, a notification unit 33, and a failure diagnosis unit 34.
The operation control unit 31 performs operation control of the car 11. Specifically, when the number of passengers obtained from the images captured by the camera 12 exceeds a certain number, the operation control unit 31 switches to full-load operation, in which the car passes floors registered by hall calls. The "number of passengers obtained from the captured image of the camera 12" refers to the number counted by the detection processing unit 22c. The door opening/closing control unit 32 controls the opening and closing of the car door 13 when the car arrives at the hall 15. Specifically, it opens the car door 13 when the car 11 arrives at the hall 15 and closes it after a predetermined time elapses.
For example, when the detection processing unit 22c detects a user near the car door 13 during the door opening operation, the door opening/closing control unit 32 performs door opening/closing control to avoid a door accident (being pulled into the door box). Specifically, it temporarily stops the opening operation of the car door 13, moves the door in the opposite (closing) direction, or slows the opening speed. The notification unit 33 issues warnings to users in the car 11 based on the detection results of the detection processing unit 22c.
The elevator control device 30 further includes a failure diagnosis unit 34, which diagnoses failures of the camera 12 or the load detector 10 based on the error between the number of passengers obtained from the captured images of the camera 12 and the number of passengers calculated from the load value of the car 11.
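The diagnosis idea can be sketched as follows. The patent does not give the conversion or the threshold; the average passenger weight and the tolerance used here are purely illustrative assumptions:

```python
def camera_or_load_fault(camera_count, load_kg,
                         avg_passenger_kg=65.0, tolerance=2):
    """Flag a possible fault in the camera 12 or the load detector 10.

    The car load is converted to an estimated passenger count using an
    assumed average passenger weight; a fault is suspected when the
    image-based count and the load-based estimate diverge by more than
    `tolerance` persons. Both parameters are illustrative, not values
    from the patent.
    """
    load_count = round(load_kg / avg_passenger_kg)
    return abs(camera_count - load_count) > tolerance
```

For instance, a camera count of 0 with a measured load of several hundred kilograms would suggest that one of the two sensors is faulty.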
Fig. 2 is a view showing the configuration of the area around the doorway in the car 11.
The car door 13 is provided at the entrance of the car 11 so as to be openable and closable. The example of fig. 2 shows a two-panel, center-opening car door 13 whose door panels 13a, 13b open and close in opposite directions along the front width direction (horizontal direction). The "front width" is the width of the entrance of the car 11.
Front posts 41a, 41b are provided on both sides of the entrance of the car 11 and, together with the door lintel plate 11a, frame the entrance of the car 11. A "front post" is also called an entrance column or entrance frame, and a door box for accommodating the car door 13 is usually provided on its rear side. In the example of fig. 2, when the car door 13 opens, one door panel 13a is accommodated in a door box 42a provided on the rear side of the front post 41a, and the other door panel 13b is accommodated in a door box 42b provided on the rear side of the front post 41b.
One or both of the front posts 41a and 41b are provided with a display 43, an operation panel 45 with destination floor buttons 44, a speaker 46, and the like. In the example of fig. 2, the speaker 46 is provided on the front post 41a, and the display 43 and the operation panel 45 on the front post 41b.
As shown in fig. 3, a rectangular mirror 50 is provided on the rear surface 49 of the car 11 at a position facing the entrance. The mirror 50 is used, for example, as a rear-view aid when a wheelchair user backs out of the car 11. The mirror 50 includes not only a "glass type" but also a "stainless-steel mirror type".
A camera 12 with an ultra-wide-angle lens such as a fisheye lens is provided at the central portion of the door lintel plate 11a above the entrance of the car 11. The camera 12 captures images of the interior of the car 11 and the hall 15 near the doorway at a predetermined frame rate (e.g., 30 frames/second). The captured images are supplied to the image processing device 20 shown in fig. 1 for the detection of users and objects.
Fig. 3 shows an example of an image captured by the camera 12, with the car door 13 (door panels 13a, 13b) and the hall door 14 (door panels 14a, 14b) fully open; the entire car interior and the vicinity of the doorway are photographed from above the doorway of the car 11 at a field angle of 180 degrees or more. The upper side of the image shows the hall 15, and the lower side shows the interior of the car 11.
In the hall 15, door pockets 17a and 17b are provided on both sides of the entrance of the car 11, and a band-shaped hall sill 18 of a predetermined width is laid in the floor surface 16 between the door pockets 17a and 17b along the opening/closing direction of the hall door 14. Similarly, a band-shaped car threshold 47 of a predetermined width is laid on the entrance side of the floor surface 19 of the car 11 along the opening/closing direction of the car door 13.
When the mirror 50 is provided in the car 11, a mirror image of a user reflected in the mirror 50 may be erroneously detected in the captured image, and the passenger count may be doubled. Masking the area of the captured image corresponding to the installation location of the mirror 50 (the mirror area) would prevent such false detection, but setting the mirror area requires providing the system with information about the mirror 50 (its installation location, etc.) in advance for each installation, which is very troublesome.
A method is described below for detecting users riding in the car 11 and accurately counting the number of passengers without being affected by mirror images of users reflected in the mirror 50 on the captured image.
Fig. 4 shows an example of the image of frame N, and fig. 5 an example of the image of frame N+1 (N being an arbitrary integer). To make the change between the images easier to see, the interval between frame N and frame N+1 is drawn longer than it actually is. In the figures, P1 denotes a user, and the broken line denotes the movement trajectory of the user P1. The mirror 50 is provided on the rear surface 49 of the car 11 at a position facing the entrance, and a mirror image of the user P1 is reflected in it.
When the car 11 opens its doors, persons are detected from the images captured by the camera 12. For person detection, for example, CoHOG features are used. A CoHOG feature is obtained by building histograms of co-occurring luminance gradient orientations (combinations of vertical, horizontal, and diagonal edges) over the pixels of each block in the image, concatenating the histograms of all blocks, and vectorizing the result. The shape of a person is learned in advance using CoHOG features, and persons are detected by comparing the learning result with the CoHOG features extracted from the current image.
Suppose the user P1 is detected as a person and given ID11 as identification information. The motion of the user P1 with ID11 is then tracked across the images obtained from the camera 12 at the predetermined frame rate. For person tracking, a method of tracking the same person between frame images using CoHOG features is used.
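The patent matches persons between frames by CoHOG features; as a simplified stand-in (my own substitution, not the patent's method), a minimal tracker that assigns IDs and accumulates per-track distance can be sketched by associating detections to the nearest existing track centroid:

```python
import math
from itertools import count

class SimpleTracker:
    """Minimal frame-to-frame tracker sketch.

    Instead of CoHOG feature matching, detections are associated to
    existing tracks by nearest centroid within a gate radius. This is
    only enough to illustrate ID assignment (starting at ID11, as in
    the example) and the per-track accumulated distance used later in
    the boarding test; detections within one frame are assumed to be
    distinct persons.
    """
    def __init__(self, gate=50.0):
        self.gate = gate          # max association distance in pixels
        self.tracks = {}          # id -> last (x, y) centroid
        self.distance = {}        # id -> accumulated tracking distance
        self._ids = count(11)     # next ID to assign

    def update(self, detections):
        for pos in detections:
            best = min(self.tracks, default=None,
                       key=lambda i: math.dist(self.tracks[i], pos))
            if best is not None and math.dist(self.tracks[best], pos) <= self.gate:
                # Same person: extend its accumulated distance.
                self.distance[best] += math.dist(self.tracks[best], pos)
                self.tracks[best] = pos
            else:
                # New person: assign a fresh ID.
                tid = next(self._ids)
                self.tracks[tid] = pos
                self.distance[tid] = 0.0
        return dict(self.distance)
```

A real implementation would also drop stale tracks and use appearance features, but the accumulated-distance bookkeeping is the part the embodiment relies on.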
If the mirror 50 is provided in the car 11, the image of the user P1 reflected in the mirror 50 may also be erroneously detected as a person. A mirror image, however, appears only within the mirror 50 and moves only within its bounds, so when the motion of a mirror image is tracked, the distance obtained by the tracking process (the tracking distance) is shorter than that of the real image. The present embodiment exploits this difference between mirror image and real image and determines a person whose tracking distance is equal to or greater than a predetermined distance Th to be the real image of the user P1. The predetermined distance Th is determined from the average distance a user detected on the captured image moves from the hall 15 to the vicinity of the front of the car 11, taking into account the position of the camera 12, the size of the car 11, the front width, the Y-direction width of the side surfaces of the door pockets 17a, 17b, and so on.
Even if the real image of the user P1 can be discriminated by the tracking process, the user P1 may merely be moving around the hall 15. Therefore, with the upper side of the captured image corresponding to the hall 15, the Y-direction coordinate of the edge of the car threshold 47 (which appears near the center of the captured image) on the car floor surface 19 side is set as the reference position Py. When the tracking process determines that the Y-direction position of the user P1 is below (on the car side of) the reference position Py, the user P1 is determined to be a user riding in the car 11 and is added to the passenger count.
In the example of fig. 4, the tracking distance of the user P1 is shorter than the predetermined distance Th, and the Y-direction position of the user P1 is above the reference position Py; at this point, the user P1 is not yet recognized as a user riding in the car 11. In the example of fig. 5, the tracking distance of the user P1 is equal to or greater than Th, and the Y-direction position of the user P1 is below the reference position Py; at this point, the user P1 is recognized as a user riding in the car 11, and the passenger count is incremented by 1.
On the other hand, fig. 6 shows a user P2, given identification information ID12, who returns to the hall 15 after once boarding the car 11. In this case, the tracking distance satisfies the condition of the predetermined distance Th, but the Y-direction position of the user P2 is above the reference position Py. The user P2 is therefore excluded from the passenger count.
The installation position of the mirror 50 is not limited to the rear surface 49 of the car 11; it may be the side surfaces 48a, 48b. Wherever the mirror 50 is installed in the car 11, the mirror image of a person reflected in it moves very little. Such mirror images can therefore be excluded from the detection targets based on the tracking distance, and miscounting of the number of passengers can be prevented.
As shown in fig. 7, the same applies when, for example, a poster 51 depicting a person is attached in the car 11. The person on the poster 51 may be erroneously detected, but since the person on the poster 51 does not move, it can be excluded from the detection targets based on the tracking distance. The same also applies when a monitor is installed in the car 11: even if a person is shown on the monitor, the tracking distance obtained by the tracking process between frame images remains small because the motion is confined to the monitor screen, so the person is excluded from the detection targets.
Next, an operation of the elevator system in the present embodiment will be described.
Fig. 8 is a flowchart for explaining the operation of the elevator system. The processing shown in the flowchart is mainly performed by the image processing apparatus 20.
Assume now that the car 11 opens its doors at some floor. The camera 12 provided in the car 11 captures images of the interior of the car 11 and the hall 15 near the doorway at the predetermined frame rate, and the captured images are stored in time series in the storage unit 21 of the image processing device 20.
When the car 11 opens its doors (yes in step S11), the detection unit 22 of the image processing device 20 reads out the images stored in the storage unit 21 in time series and detects persons in them (step S12). The person detection and tracking processes use generally known methods such as the "CoHOG features" described above, and their details are omitted here.
When a new person is detected (yes in step S13), the detection unit 22 assigns identification information (ID) to the person and starts tracking (step S14). The detection unit 22 performs tracking of each person to which identification information has been assigned across the images obtained from the camera 12 in time series (step S15), and detects users riding in the car 11 based on the tracking distance and position obtained by the tracking process (step S16).
Specifically, as shown in fig. 9, the detection unit 22 first checks whether the tracking distance of the person is equal to or greater than the predetermined distance Th (step S21). If the tracking distance is smaller than Th (no in step S21), the detection unit 22 determines that the tracked person is a mirror image reflected in the mirror 50 or an image on an installed object such as the poster 51, and excludes it from the detection targets (step S24).
If the tracking distance is equal to or greater than Th (yes in step S21), the detection unit 22 determines whether the position of the tracked person is below the reference position Py (step S22). As described above, the reference position Py is set to the Y-direction position of the car threshold 47, which appears near the center of the captured image. If the Y-direction position of the tracked person is below Py (yes in step S22), the detection unit 22 determines that the person is a user riding in the car 11 (step S23). If it is above Py (no in step S22), the detection unit 22 determines that the person is a user in the hall 15 and excludes the person from the detection targets (step S24).
When the tracked person is thus determined to be a user riding in the car 11, the detection unit 22 includes that user in the passenger count (step S17 in fig. 8). While the car 11 is open, the above processing is repeated to count the users riding in the car 11. Since mirror images reflected in the mirror 50, persons on the poster 51, and the like are excluded from the detection targets, the passenger count is not inflated.
Tracking of a user continues even after the user has boarded the car 11. Therefore, as shown in fig. 6, when the user P2, who was already counted, steps out of the car 11, the current passenger count can be corrected by decrementing it by 1. Because each user is identified by identification information, it is possible to determine who got off. A user standing near the door in the car 11 may also step out to the hall 15 temporarily to let a user behind them get off, and then board again. Since the user is identified by identification information, as long as the user remains within the same image the system can recognize that the user went out to the hall 15 and boarded again, and can reflect this accurately in the passenger count: the count is decremented by 1 when the user steps out and incremented by 1 when the user boards again.
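The count correction described above amounts to reacting to a tracked person's transitions between the hall side and the car side of the reference position Py. A minimal sketch (illustrative only; the patent describes the behavior, not this function):

```python
def update_count(passenger_count, prev_in_car, now_in_car):
    """Adjust the passenger count from one person's state transition.

    prev_in_car / now_in_car: whether the tracked person (identified
    by ID) was / is on the car side of the reference position Py.
    Boarding adds 1, stepping back out to the hall subtracts 1, and
    no transition leaves the count unchanged.
    """
    if not prev_in_car and now_in_car:
        return passenger_count + 1   # boarded (or re-boarded)
    if prev_in_car and not now_in_car:
        return passenger_count - 1   # stepped out to the hall
    return passenger_count
```

Calling this once per tracked ID per frame reproduces the -1/+1 behavior for a user who steps out to let someone pass and then boards again.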
As described above, according to the present embodiment, by focusing on the tracking distance (the distance accumulated since the start of tracking) of each person detected on the captured image, the mirror image of a person reflected in the mirror 50 can be excluded from the detection targets when the mirror 50 is installed in the car 11. There is therefore no need to mask the mirror area in the captured image in advance, and, without tracking persons three-dimensionally with a stereo camera, only the users riding in the car 11 can be accurately detected from the tracking distance and position obtained with the single camera 12 and reflected in the passenger count.
The same applies when the poster 51 is attached in the car 11 or when a monitor is installed: the images on these installed objects can be excluded from the detection targets based on the tracking distance, so only the users riding in the car 11 are detected and reflected in the passenger count.
The above description assumed that the car 11 has its doors open, but the same applies while the doors are closing: mirror images of persons in the mirror 50, persons on the poster 51, and the like are excluded from the detection targets based on the tracking distance and position obtained by the tracking process, and only the users in the car 11 are accurately detected.
(application example)
If the number of passengers is obtained from the images captured by the camera 12, the following become possible.
(1) Display of the in-car situation
For example, the number of passengers is displayed, as the situation inside the car, on an indicator (not shown) provided in the hall 15. Specifically, the elevator control device 30 acquires the passenger count from the image processing device 20 and displays it as the current in-car situation on the indicator in the hall 15 of the floor where the car 11 is stopped. Users in the hall 15 can thus see the current number of passengers and, for example, choose not to board when the car is crowded.
(2) Switching to full-load operation
The number of passengers obtained from the images captured by the camera 12 does not include mirror images of persons in the mirror 50, persons on the poster 51, and the like; only the users who have actually boarded the car 11 are counted. Using this count, the full-load state can therefore be determined accurately, and the system can switch to full-load operation, in which the car passes floors registered by hall calls.
Specifically, as shown in fig. 10, when the doors of the car 11 are closed, the elevator control device 30 acquires the number of passengers counted by the detection unit 22 from the image processing device 20 (step S31). If the number of passengers is within the predetermined number specified as the rated capacity of the car 11 (yes in step S32), the elevator control device 30 performs normal operation, stopping the car 11 at the floors where hall calls are registered (step S33). On the other hand, when the number of passengers exceeds the predetermined number (no in step S32), the elevator control device 30 switches to full-load operation, in which the car 11 does not stop at the floors where hall calls are registered but travels directly to the destination floors of the users (step S34). At a floor where a hall call is registered, another car responds instead, or the car 11 responds after its users have gotten off at their destination floors.
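The decision in Fig. 10 reduces to a single threshold check. The sketch below is illustrative: the rated capacity is an assumed value, and the mode names are placeholders.

```python
RATED_PASSENGERS = 10  # assumed rated capacity of the car 11

def select_operation(passenger_count: int) -> str:
    """Sketch of the Fig. 10 flow: choose the operation mode when the doors
    close. Under full-load operation the car skips floors with registered
    hall calls and travels directly to the users' destination floors."""
    if passenger_count <= RATED_PASSENGERS:   # within rated capacity?
        return "normal"                       # stop at hall-call floors
    return "full_load"                        # pass hall-call floors
```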
(3) Switching to full-load operation taking into account load values
Even if the number of passengers in the car 11 is correctly counted from the image captured by the camera 12, the load value of the car 11 still varies with, for example, the weight of the users and their luggage. It is therefore preferable to include the load value, in addition to the number of passengers, in the conditions for the full-load determination.
Specifically, as shown in fig. 11, when the doors of the car 11 are closed, the elevator control device 30 acquires the number of passengers counted by the detection unit 22 from the image processing device 20 (step S41). If the number of passengers is within the predetermined number specified as the rated capacity of the car 11 (yes in step S42), the elevator control device 30 acquires the current load value of the car 11 from the load detector 10 (step S43). If the load value is within the predetermined value specified as the rated load of the car 11 (yes in step S44), the elevator control device 30 performs normal operation, stopping the car 11 at the floors where hall calls are registered (step S45). On the other hand, when the number of passengers exceeds the predetermined number (no in step S42) or the load value exceeds the predetermined value (no in step S44), the elevator control device 30 switches to full-load operation and passes the floors where hall calls are registered (step S46).
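The two-condition check of Fig. 11 can be sketched as below. Both thresholds are assumed values for illustration; the patent only requires that full-load operation be entered when either condition fails.

```python
RATED_PASSENGERS = 10   # assumed rated capacity of the car 11
RATED_LOAD_KG = 650.0   # assumed rated load of the car 11

def select_operation_with_load(count: int, load_kg: float) -> str:
    """Sketch of the Fig. 11 flow: full-load operation is entered when
    either the passenger count or the measured load exceeds its rated
    value; otherwise the car runs normally."""
    if count > RATED_PASSENGERS:      # count exceeds rated capacity
        return "full_load"
    if load_kg > RATED_LOAD_KG:       # load exceeds rated load
        return "full_load"
    return "normal"                   # stop at hall-call floors
```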
(4) Fault determination
By comparing the number of passengers obtained from the image captured by the camera 12 with the number of passengers calculated from the load value of the load detector 10, it is possible to determine whether the camera 12 or the load detector 10 has failed.
Specifically, as shown in fig. 12, when the doors of the car 11 are closed, the elevator control device 30 acquires the current load value of the car 11 from the load detector 10 (step S51) and calculates the number of passengers from that load value (step S52). For example, when the load value is 500 kg and the average load per user is 60 kg, the number of passengers is 500 / 60 ≈ 8.
Here, the elevator control device 30 compares the number of passengers calculated from the load value with the number of passengers obtained from the image processing device 20 (step S53). If the two counts differ by a predetermined value or more (yes in step S54), the elevator control device 30 determines that the camera 12 or the load detector 10 has failed and, for example, reports the failure to a monitoring center (not shown) so that it can be dealt with (step S55).
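The fault check of Fig. 12, including the worked example above (500 kg / 60 kg → 8 passengers), can be sketched as follows. The tolerance before a fault is reported is an assumed value.

```python
AVG_PASSENGER_KG = 60.0   # average load per user, as in the example above
MAX_COUNT_ERROR = 2       # assumed tolerance before reporting a fault

def check_for_fault(load_kg: float, image_count: int) -> bool:
    """Sketch of the Fig. 12 flow: estimate the passenger count from the
    measured load and flag a fault of the camera or the load detector when
    it disagrees with the image-based count by the tolerance or more."""
    load_count = int(load_kg // AVG_PASSENGER_KG)          # e.g. 500 // 60 -> 8
    return abs(load_count - image_count) >= MAX_COUNT_ERROR
```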
According to at least one of the embodiments described above, it is possible to provide an elevator system that accurately detects users riding in the car without being affected by images on installed objects such as a mirror or a poster in the car, thereby preventing miscounting of the number of passengers.
In addition, although several embodiments of the present invention have been described, these embodiments are presented by way of example and are not meant to limit the scope of the invention. These novel embodiments can be implemented in various other modes, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are included in the invention described in the claims and their equivalents.

Claims (10)

1. An elevator system, comprising:
an imaging unit that captures, from inside the car, an image of a range including the hall near the entrance;
a person detection unit that detects a person from the image obtained by the imaging unit;
a tracking unit that tracks the person detected by the person detection unit; and
a detection processing unit configured to detect a user riding in the car based on the tracking distance and position of the person obtained by the tracking unit.
2. An elevator system according to claim 1, characterized in that,
wherein, when the tracking distance of the person is equal to or greater than a predetermined distance and the hall appears in the upper part of the image, the detection processing unit determines that the person is a user riding in the car if the person is located below a predetermined reference position on the image.
3. An elevator system according to claim 2, characterized in that,
wherein the detection processing unit excludes the person from the detection targets when the tracking distance of the person is smaller than the predetermined distance or the position of the person is above the reference position.
4. An elevator system according to claim 3, characterized in that,
wherein, when the tracking distance of the person is smaller than the predetermined distance, the detection processing unit determines that the person is an image on an installed object, such as a mirror, installed in the car, and excludes the person from the detection targets.
5. An elevator system according to claim 3, characterized in that,
wherein, when the person is located above the reference position, the detection processing unit determines that the person is a user located in the hall.
6. An elevator system according to claim 2, characterized in that,
wherein the reference position is defined as the position of the sill of the car on the image.
7. An elevator system according to claim 1, characterized in that,
wherein the detection processing unit counts the number of passengers when it detects a user riding in the car.
8. An elevator system according to claim 1, characterized in that,
further comprising an operation control unit configured to control the operation of the car,
wherein, when the number of passengers obtained from the image of the imaging unit exceeds a predetermined number, the operation control unit switches to full-load operation, in which the car passes floors where hall calls are registered.
9. The elevator system of claim 8, wherein,
further comprising a load detection unit provided in the car,
wherein the operation control unit switches to the full-load operation in further consideration of the load value detected by the load detection unit.
10. An elevator system according to claim 1, characterized in that,
further comprising a load detection unit provided in the car,
and a fault diagnosis unit configured to perform fault diagnosis of the imaging unit or the load detection unit based on the error between the number of passengers obtained from the image of the imaging unit and the number of passengers calculated from the load value detected by the load detection unit.
CN202310709737.XA 2022-06-16 2023-06-15 Elevator system Pending CN117246862A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-097402 2022-06-16
JP2022097402A JP7322250B1 (en) 2022-06-16 2022-06-16 elevator system

Publications (1)

Publication Number Publication Date
CN117246862A true CN117246862A (en) 2023-12-19

Family

ID=87519656

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310709737.XA Pending CN117246862A (en) 2022-06-16 2023-06-15 Elevator system

Country Status (2)

Country Link
JP (1) JP7322250B1 (en)
CN (1) CN117246862A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117387629B (en) * 2023-12-12 2024-03-12 广东车卫士信息科技有限公司 Indoor navigation route generation method and system

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN113165831B (en) 2018-11-30 2022-06-28 株式会社日立制作所 System for acquiring information on number of people, method for acquiring information on number of people, and elevator
JP7074161B2 (en) 2020-07-07 2022-05-24 フジテック株式会社 Image processing device and image processing method
JP7043565B2 (en) 2020-10-20 2022-03-29 東芝エレベータ株式会社 Elevator user detection system

Also Published As

Publication number Publication date
JP2023183735A (en) 2023-12-28
JP7322250B1 (en) 2023-08-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination