CN111717738A - Elevator system - Google Patents

Elevator system

Info

Publication number
CN111717738A
CN111717738A
Authority
CN
China
Prior art keywords
camera
image
mark
car
elevator system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911167951.7A
Other languages
Chinese (zh)
Other versions
CN111717738B (en)
Inventor
木村纱由美
田村聪
野田周平
横井谦太朗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Elevator and Building Systems Corp
Original Assignee
Toshiba Elevator Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Elevator Co Ltd filed Critical Toshiba Elevator Co Ltd
Publication of CN111717738A publication Critical patent/CN111717738A/en
Application granted granted Critical
Publication of CN111717738B publication Critical patent/CN111717738B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 - Control systems of elevators in general
    • B66B1/02 - Control systems without regulation, i.e. without retroactive action
    • B66B1/06 - Control systems without regulation, i.e. without retroactive action electric
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 - Control systems of elevators in general
    • B66B1/34 - Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/3415 - Control system configuration and the data transmission or communication within the control system
    • B66B1/3423 - Control system configuration, i.e. lay-out
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 - Control systems of elevators in general
    • B66B1/34 - Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/3415 - Control system configuration and the data transmission or communication within the control system
    • B66B1/3446 - Data transmission or communication within the control system
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B5/00 - Applications of checking, fault-correcting, or safety devices in elevators
    • B66B5/0006 - Monitoring devices or performance analysers
    • B66B5/0012 - Devices monitoring the users of the elevator system

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)
  • Cage And Drive Apparatuses For Elevators (AREA)
  • Image Analysis (AREA)

Abstract

The invention aims to detect a deviation in the installation position of a camera. According to an embodiment, an elevator system comprises: an image processing device that performs image processing on images, captured by the camera, that include the inside of the car and the hall, in order to detect a user near the door of the car; and a manager terminal connected to the image processing device and operable by a manager. The elevator system acquires from the camera an image captured with a mark, distinguishable from the floor surfaces of the car and the hall, placed in the imaging range; recognizes the mark in the acquired image; displays the recognition result; specifies the position of the recognized mark in accordance with an operation performed by the manager on the displayed recognition result; and detects a deviation in the installation position of the camera from the specified position of the mark.

Description

Elevator system
The present application is based on Japanese patent application 2019-053670 (filed March 20, 2019) and claims priority from that application, which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of the present invention relate to an elevator system.
Background
In recent years, various techniques have been proposed to prevent people and objects from being caught by elevator car doors. For example, a technique has been proposed in which a user moving toward the elevator is detected using a camera and the door-open time of the elevator door is extended accordingly.
In such a technique, the user moving toward the elevator must be detected with high accuracy from the image captured by the camera. However, if the mounting position of the camera shifts, the image captured by the camera is rotated or displaced in the left-right direction, and the detection accuracy for the user may therefore be degraded.
Therefore, it is desirable to realize a new technique capable of detecting a deviation of the installation position of the camera.
Disclosure of Invention
An elevator system capable of detecting a deviation in the installation position of a camera is provided.
According to an embodiment, an elevator system comprises: an image processing device capable of detecting a user near the door of the car from an image, captured by the camera, that includes the inside of the car and the waiting hall; and a manager terminal connected to the image processing device and operable by a manager. The elevator system includes an acquisition unit, a recognition unit, a display unit, a determination unit, and a detection unit. The acquisition unit acquires from the camera an image captured with a mark, distinguishable from the floor surface of the car and the floor surface of the hall, placed in the imaging range. The recognition unit recognizes the mark in the acquired image. The display unit displays the recognition result of the recognition unit. The determination unit determines the position of the recognized mark based on an operation performed by the manager on the recognition result displayed by the display unit. The detection unit detects a deviation in the mounting position of the camera based on the determined position of the mark.
Drawings
Fig. 1 is a diagram showing a schematic configuration example of an elevator system according to an embodiment.
Fig. 2 is a diagram showing an example of a hardware configuration of an image processing device included in an elevator system.
Fig. 3 is a diagram showing an image captured without a deviation in the installation position of the camera.
Fig. 4 is a diagram showing an image captured in a case where there is a displacement in the installation position of the camera.
Fig. 5 is a diagram showing an example of a marker provided in the imaging range of the camera.
Fig. 6 is a block diagram showing an example of a functional configuration of the image processing apparatus.
Fig. 7 is a flowchart showing an example of a procedure of processing of the elevator system in the calibration function.
Fig. 8 is a diagram supplementing the flowchart shown in fig. 7, and shows an image captured by the camera.
Fig. 9 is a diagram for explaining the misrecognition suppression function as one function of the calibration function, and shows an example of the confirmation image.
Fig. 10 is a diagram for explaining the misrecognition suppression function as one function of the calibration function, and shows an example of a display screen displayed on the manager terminal.
Fig. 11 is a diagram for explaining the misrecognition suppression function as one function of the calibration function, and shows an example of a return image.
Fig. 12 is a diagram for explaining the misrecognition suppression function as one function of the calibration function, and is a diagram showing an example of a confirmation image different from that in fig. 9.
Fig. 13 is a diagram for explaining the misrecognition suppression function as one function of the calibration function, and shows an example of a confirmation image different from that in fig. 9 and 12.
Detailed Description
Hereinafter, embodiments will be described with reference to the drawings. The present invention is not limited to the contents described in the following embodiments, and variations that can readily be conceived by those skilled in the art naturally fall within the scope of this disclosure. In the drawings, the dimensions, shapes, and the like of each portion may be shown schematically, altered from the actual embodiment, in order to make the description clearer. Corresponding elements are denoted by the same reference numerals, and their detailed description may be omitted.
Fig. 1 is a diagram showing a schematic configuration example of an elevator system according to an embodiment.
A camera 12 is provided at the upper part of the entrance of the car 11. Specifically, the lens portion of the camera 12 is installed in a lintel plate 11a, which covers the upper part of the entrance of the car 11, and is oriented so as to capture both the inside of the car 11 and the hall 15. The camera 12 is, for example, a small monitoring camera such as an in-vehicle camera, has a wide-angle lens, and continuously captures images at a predetermined frame rate (for example, 30 frames/second).
The camera 12 may be kept on and capture images at all times, or may be turned on at a predetermined timing to start capturing and turned off at a predetermined timing to stop. For example, the camera 12 may be turned on when the moving speed of the car 11 falls below a predetermined value and turned off when the moving speed is equal to or greater than that value. In this case, imaging starts when the car 11 begins decelerating to stop at a given floor and its moving speed falls below the predetermined value, and ends when the car 11 begins accelerating toward another floor and its moving speed reaches the predetermined value. That is, imaging continues throughout the period from the start of deceleration toward the floor, through the stop at that floor, until the car 11 accelerates away and its moving speed again reaches the predetermined value.
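The on/off rule just described can be sketched as a small state helper. This is an illustrative sketch only; the class name, threshold value, and return convention are assumptions, not part of the patent.

```python
class CameraPowerController:
    """Turns the camera on while the car moves slower than a threshold,
    and off once the car accelerates past it (sketch of the rule above)."""

    def __init__(self, speed_threshold: float):
        self.speed_threshold = speed_threshold  # predetermined value, e.g. in m/s
        self.camera_on = False

    def update(self, car_speed: float) -> bool:
        # ON when speed < threshold (decelerating to stop, or stopped at a floor);
        # OFF when speed >= threshold (accelerating toward another floor).
        self.camera_on = car_speed < self.speed_threshold
        return self.camera_on
```

With an assumed threshold of 0.5 m/s, the camera stays on through deceleration and the stop, and switches off as the car accelerates away.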
The imaging range of the camera 12 is set to L1 + L2 (L1 ≥ L2). L1 is the imaging range on the hall 15 side, extending from the car door 13 toward the hall 15. L2 is the imaging range on the car 11 side, extending from the car door 13 toward the back of the car. L1 and L2 are ranges in the depth direction; the range in the width direction (the direction orthogonal to the depth direction) is at least larger than the lateral width of the car 11.
In the hall 15 on each floor, a hall door 14 that can open and close is provided at the arrival gate of the car 11. The hall doors 14 engage with the car doors 13 when the car 11 arrives, and open and close together with them. The power source (door motor) is on the car 11 side, and the hall doors 14 simply follow the car doors 13. In the following description, it is assumed that the hall doors 14 open when the car doors 13 open and close when the car doors 13 close.
Each image (video frame) continuously captured by the camera 12 is processed in real time by the image processing device 20. Specifically, the image processing device 20 detects (the movement of) the user closest to the car door 13 based on changes in the luminance values of the image within a preset region (hereinafter referred to as the detection region), and determines, for example, whether the detected user intends to board the car 11 and whether the user's hand or arm is at risk of being pulled into the door pocket. The result of the image processing by the image processing device 20 is reflected, as necessary, in the control processing (mainly door opening/closing control) of the elevator control device 30.
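The luminance-based detection described above can be sketched as follows, assuming 8-bit grayscale frames held as NumPy arrays. The function name, region encoding, and threshold are illustrative assumptions, not the patent's actual user detection processing.

```python
import numpy as np

def user_detected(prev_frame: np.ndarray, cur_frame: np.ndarray,
                  region: tuple, threshold: float) -> bool:
    """Return True when the mean absolute luminance change inside the
    detection region exceeds `threshold` (a crude proxy for a moving user)."""
    y0, y1, x0, x1 = region  # detection region as pixel bounds
    prev = prev_frame[y0:y1, x0:x1].astype(np.int16)
    cur = cur_frame[y0:y1, x0:x1].astype(np.int16)
    return bool(np.abs(cur - prev).mean() > threshold)
```

A real implementation would track the user across frames to estimate direction of movement; this sketch only shows why the detection region must sit at the correct image position.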
The elevator control device 30 controls the opening and closing of the car doors 13 when the car 11 arrives at the hall 15. Specifically, the elevator control device 30 opens the car doors 13 when the car 11 arrives at the hall 15 and closes them after a predetermined time has elapsed.
However, when the image processing device 20 detects a user who intends to board the car 11, the elevator control device 30 prohibits the door closing operation of the car doors 13 and maintains the door-open state (extends the door-open time of the car doors 13). When the image processing device 20 detects a user whose hand or arm may be pulled into the door pocket, the elevator control device 30 prohibits the door opening operation of the car doors 13, reduces the door opening speed below normal, or broadcasts an announcement urging the user to move away from the car door 13, thereby notifying the user of the risk of a hand or arm being pulled into the door pocket.
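The two door responses above amount to a small decision rule. A sketch follows; the function name and action labels are illustrative, not taken from the patent.

```python
def door_action(user_boarding: bool, pull_in_risk: bool) -> str:
    """Map the detection results of the image processing device to a
    door response of the elevator control device (labels are illustrative)."""
    if pull_in_risk:
        # Prohibit or slow the door opening and warn the user.
        return "inhibit_or_slow_open"
    if user_boarding:
        # Keep the doors open (extend the door-open time).
        return "hold_open"
    return "close_after_timeout"
```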
The manager terminal 40 is a terminal device held by a manager at a monitoring center that monitors the operation state of the elevator system, or by a maintenance person who performs maintenance inspections of the elevator system, and is connected to the image processing device 20. The manager terminal 40 may be connected to the image processing device 20 either wirelessly or by wire. The manager terminal 40 is, for example, a smartphone, a tablet terminal, or a personal computer (PC).
Note that, although fig. 1 shows the image processing device 20 outside the car 11 for convenience, the image processing device 20 is actually housed in the lintel plate 11a together with the camera 12. Fig. 1 also illustrates the camera 12 and the image processing device 20 as separate units, but the two may be integrated into a single device. Further, fig. 1 illustrates the image processing device 20 and the elevator control device 30 as separate units, but the functions of the image processing device 20 may instead be implemented in the elevator control device 30, in which case the manager terminal 40 is connected to the elevator control device 30.
Fig. 2 is a diagram showing an example of the hardware configuration of the image processing apparatus 20.
As shown in fig. 2, the image processing device 20 has a nonvolatile memory 22, a CPU 23, a main memory 24, a communication device 25, and the like connected to a bus 21.
The nonvolatile memory 22 stores various programs, including an operating system (OS). Among them are a program for executing the image processing (more specifically, the user detection processing described later) and a program for realizing the calibration function described later (hereinafter, the calibration program).
The CPU 23 is, for example, a processor that executes the various programs stored in the nonvolatile memory 22. The CPU 23 also performs overall control of the image processing device 20.
The main memory 24 is used, for example, as a work area required when the CPU 23 executes the various programs.
The communication device 25 controls wired or wireless communication (transmission and reception of signals) with external devices such as the camera 12, the elevator control device 30, and the manager terminal 40.
Here, as described above, the image processing device 20 executes user detection processing that detects the user closest to the car door 13 based on changes in the luminance values of the image within the preset detection area. Because this processing focuses on luminance changes inside that area, the detection area must always be set at the correct, fixed position on the image.
However, during operation of the elevator system, if the mounting position (mounting angle) of the camera 12 is displaced, for example by an impact to the car 11 or the camera 12, the detection region is also displaced. The image processing device 20 then observes luminance changes in a region different from the one it should be observing, and as a result a user (or object) that should be detected may be missed, or a user (or object) that should not be detected may be detected erroneously.
Fig. 3 shows an example of an image captured when the mounting position of the camera 12 is not shifted. Although not shown in fig. 1, a sill (hereinafter referred to as the car sill) 13a that guides the opening and closing of the car door 13 is provided on the car 11 side. Similarly, a sill (hereinafter referred to as the hall sill) 14a that guides the opening and closing of the hall door 14 is provided on the hall 15 side. In fig. 3, the hatched portion indicates the detection region e1 set on the image. Here, as an example, the detection area e1 is set to extend a predetermined range from the car-11-side long side of the rectangular car sill 13a toward the hall 15, in order to detect users present in the hall 15. To guard against hands and arms being pulled into the door pocket, a detection area may instead be set on the car 11 side, or multiple detection areas may be set on both the hall 15 side and the car 11 side.
On the other hand, fig. 4 shows an example of an image captured when the attachment position of the camera 12 is shifted. In addition, the hatched portion in fig. 4 shows a detection region e1 set on the image, similarly to fig. 3.
As shown in fig. 4, when the mounting position of the camera 12 is shifted, the image captured by the camera 12 becomes, for example, rotated (tilted) compared with the case shown in fig. 3. Because the detection area e1 is set at a fixed position on the image, just as in fig. 3, it should extend a predetermined range from the car-11-side long side of the rectangular car sill 13a toward the hall 15 as in fig. 3; instead, as shown in fig. 4, it extends from a position unrelated to that long side. As a result, as described above, a user who should be detected may be missed, or a user who should not be detected may be detected erroneously. Fig. 4 illustrates the case where the image is rotated by the camera displacement, but the same problem arises when the image is shifted in the left-right direction.
Therefore, the image processing device 20 of the present embodiment has a calibration function that detects whether the mounting position of the camera 12 is displaced and, if a displacement has occurred, sets the detection area at an appropriate position according to the displacement. The calibration function is described in detail below.
In addition, when the calibration function is executed, a mark m as shown in fig. 5 must be placed within the imaging range of the camera 12, for example by a maintenance person who performs maintenance inspections of the elevator system. Here, the mark m is square and contains a pattern of 4 black circular symbols, but any mark may be used as long as it is a quadrangle with four right-angled corners and contains a pattern distinguishable from other objects within the imaging range of the camera 12 (for example, the floor surfaces of the car 11 and the hall 15).
Fig. 6 is a block diagram showing an example of a functional configuration of the image processing apparatus 20 according to the present embodiment. Here, the functional configuration related to the calibration function will be mainly described.
As shown in fig. 6, the image processing apparatus 20 includes a storage unit 201, an image acquisition unit 202, an offset detection unit 203, a setting processing unit 204, a notification processing unit 205, and the like. As shown in fig. 6, the offset detection unit 203 further includes a recognition processing unit 231, a calculation processing unit 232, a detection processing unit 233, and the like.
In the present embodiment, the units 202 to 205 are realized by the CPU 23 shown in fig. 2 (i.e., the computer of the image processing device 20) executing the calibration program (i.e., software) stored in the nonvolatile memory 22; however, the units 202 to 205 may instead be realized by hardware or by a combination of software and hardware. The storage unit 201 is constituted by, for example, the nonvolatile memory 22 shown in fig. 2 or another storage device.
The storage unit 201 stores set values related to the calibration function. These include a value indicating the relative position of each mark with respect to a reference point (hereinafter, the first set value). The reference point is a position that serves as an index for detecting whether the mounting position of the camera 12 is shifted; for example, the center of the car-11-side long side of the rectangular car sill 13a corresponds to the reference point. The reference point need not be that center, however, and any position may be used as the reference point as long as it is within the imaging range of the camera 12 when the mounting position of the camera 12 is not shifted.
The set values related to the calibration function also include a value (hereinafter, the second set value) indicating the relative position of the camera 12 with respect to the reference point in an image (reference image) captured while the mounting position of the camera 12 was not displaced.
Further, the set values include values indicating the relative positions of the vertices (four corners) of the car sill 13a with respect to the reference point (hereinafter, the third set value). In the present embodiment, the detection area is assumed to extend a predetermined range from the car-11-side long side of the rectangular car sill 13a toward the hall 15, so the third set value comprises the relative positions of the car-sill vertices; however, the third set value is not limited to this, and values corresponding to the area in which a detection area is to be set may be used instead. For example, when the detection area is set near the door pocket to guard against hands or arms being pulled in, the third set value may comprise the relative positions of feature points of the door pocket with respect to the reference point.
The set values related to the calibration function further include values indicating the height from the floor surface of the car 11 to the camera 12 and the angle of view (focal length) of the camera 12 (hereinafter, the camera set values).
Note that the storage unit 201 may store an image (reference image) captured when the mounting position of the camera 12 is not shifted.
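The stored set values might be grouped as in the following sketch. The field names and example units are assumptions for illustration; the patent names the values only abstractly.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # a position relative to the reference point

@dataclass
class CalibrationSettings:
    """Illustrative container for the set values kept by the storage unit 201."""
    mark_offsets: List[Point]         # first set value: marks m vs. reference point
    camera_offset: Point              # second set value: camera 12 vs. reference point
    sill_corner_offsets: List[Point]  # third set value: car-sill vertices vs. reference point
    camera_height: float              # camera set value: height above the car floor
    view_angle: float                 # camera set value: angle of view (focal length)
```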
The image acquisition unit 202 acquires an image (hereinafter, the captured image) captured by the camera 12 while a plurality of marks m are placed on the floor surface of the car 11. In the present embodiment, the marks m are assumed to be placed on the car floor at both ends of the car-11-side long side of the rectangular car sill 13a (hereinafter, simply, at both ends of the car sill 13a), but the marks m may instead be placed on the hall-15-side floor surface or on the car sill 13a or the hall sill 14a, as long as their relative positions with respect to the reference point (in the present embodiment, the center of the car sill 13a) can be specified.
The offset detection unit 203 performs recognition processing on the captured image acquired by the image acquisition unit 202 and recognizes (extracts) the plurality of marks m included in it. A mark m in the captured image can be recognized, for example, by registering its pattern in advance (in the present embodiment, by recognizing as a mark m an object consisting of 4 black circular symbols inside a square) or by using another known image recognition technique.
In the present embodiment, recognizing the marks m includes calculating their coordinate values on the captured image. The coordinate value of a mark m is calculated by treating the center point (centroid) of the quadrangle formed by connecting the center points of the 4 black circular symbols of the recognized object as the position of the mark m. Here the centroid of that quadrangle is treated as the mark m, but which part of a recognized object is treated as the mark m may be set arbitrarily.
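Taking the centroid of the quadrangle formed by the four circle centers, as described above, reduces to averaging the four center coordinates. A minimal sketch follows (the function name is an assumption):

```python
def mark_coordinate(circle_centers):
    """Return the centroid of the quadrangle whose vertices are the center
    points of the 4 black circular symbols, treated as the mark position."""
    xs = [x for x, _ in circle_centers]
    ys = [y for _, y in circle_centers]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```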
The offset detection unit 203 transmits the result of the recognition processing to the manager terminal 40. The recognition result includes an image in which icons indicating (the positions of) the recognized marks m are superimposed on the captured image. The transmission of the recognition result is performed via the communication device 25.
As will be described in detail later, the manager terminal 40 specifies (the positions of) the marks m included in the image transmitted from the image processing device 20 (the offset detection unit 203). The offset detection unit 203 then detects a displacement of the mounting position of the camera 12 from the marks m specified on the manager terminal 40. The functions of the recognition processing unit 231, the calculation processing unit 232, and the detection processing unit 233 included in the offset detection unit 203 are described below together with the flowchart, so their detailed description is omitted here.
When the offset detection unit 203 detects that the mounting position of the camera 12 is displaced, the setting processing unit 204 sets the detection area at an appropriate position, corrected for the displacement, in the captured image acquired by the image acquisition unit 202. A detection area that accounts for the displacement of the camera 12 is thereby set on the captured image. The coordinate values of the corrected detection area may be stored in the storage unit 201.
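Correcting the detection area for a detected rotation and left-right shift can be sketched as a planar transform of its corner points. This is a geometric illustration under assumed parameter names; the patent does not specify the exact computation performed by the setting processing unit 204.

```python
import math

def adjust_detection_area(corners, angle_rad, shift, pivot):
    """Rotate the detection-area corner points by the detected camera rotation
    about `pivot`, then translate them by the detected shift (dx, dy)."""
    px, py = pivot
    dx, dy = shift
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    adjusted = []
    for x, y in corners:
        # Standard 2D rotation about the pivot, followed by the translation.
        rx = cos_a * (x - px) - sin_a * (y - py) + px
        ry = sin_a * (x - px) + cos_a * (y - py) + py
        adjusted.append((rx + dx, ry + dy))
    return adjusted
```

In practice the rotation angle and shift would be derived from the difference between the specified mark positions and their expected positions stored as set values.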
When the offset detection unit 203 detects that the installation position of the camera 12 is displaced, the notification processing unit 205 reports the displacement (as an abnormality) to the manager of the monitoring center that monitors the operation state of the elevator system and to the terminal held by the maintenance person who placed the marks m and performs maintenance inspections of the elevator system. The notification is made, for example, via the communication device 25.
Next, the processing procedure of the elevator system for the calibration function of the present embodiment will be described with reference to the flowchart of fig. 7. The series of processing shown in fig. 7 may be performed during regular maintenance or, for example, before the elevator system is put into operation.
First, the image acquisition unit 202 of the image processing device 20 acquires from the camera 12 an image (captured image) taken with a plurality of marks m placed on the floor surface of the car 11 (step S1). Here, as an example, assume that the captured image i1 shown in fig. 8 is acquired by the image acquisition unit 202. As shown in fig. 8, the captured image i1 includes the two marks m1 and m2 placed at both ends of the car sill 13a. As will be described in detail later, the captured image i1 also includes a mirror image m' of the mark m1.
Next, the recognition processing unit 231 included in the offset detection unit 203 performs recognition processing on the captured image acquired by the image acquisition unit 202, and recognizes (extracts) the plurality of markers m included in the captured image (step S2).
In the present embodiment, as described above, the plurality of marks m are placed at both ends of the car sill 13a. As a result, the following problem may occur when the marks m are recognized in the captured image.
In general, the entrance columns near the car sill 13a are often made of a glossy metal material (a material with specular reflection characteristics) such as aluminum or stainless steel. In recent years, to improve the appearance, not only the entrance columns but also the side walls inside the car 11 are often made of glossy, mirror-finished metal. Therefore, when the marks m are placed near parts made of such material, as at both ends of the car sill 13a, a mirror image m' of a mark m (in the case of fig. 8, the mirror image m' of the mark m1) may be reflected in the glossy surface (in the case of fig. 8, the side wall inside the car 11). Fig. 8 illustrates a single mirror image m' reflected in the side wall of the car 11, but multiple mirror images m' may be reflected in multiple places.
The recognition processing unit 231 may therefore erroneously recognize a mirror image m' reflected in a glossy metal part as a mark m. If a mirror image m' is erroneously recognized as a mark m, the relative position of the camera 12 with respect to the reference point, described later, cannot be calculated accurately, and it may become impossible to detect whether the mounting position of the camera 12 is shifted.
Therefore, the recognition processing unit 231 of the present embodiment has, as one of its calibration functions, an erroneous recognition suppressing function: it transmits to the administrator terminal 40 an image in which icons indicating the positions of the plurality of marks m recognized in step S2 are superimposed on the captured image, so that whether or not the mirror image m' has been erroneously recognized as a mark m can be confirmed via the administrator terminal 40.
Specifically, as shown in fig. 9, the recognition processing unit 231 generates an image (hereinafter referred to as a confirmation image) in which icons indicating the positions of the plurality of marks m recognized in step S2 (for example, ×-shaped icons obtained by connecting four black circular marks) are superimposed on the captured image, and transmits the image to the administrator terminal 40. In the present embodiment, the case where the icon is ×-shaped is exemplified, but the icon may have any shape as long as the position of the mark m can be uniquely specified.
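As a sketch of how such an ×-shaped icon could be constructed from four recognized circle centers, the two strokes can be taken as the diagonals of the points' bounding quadrilateral. The pairing rule and the function name are illustrative assumptions, not the patent's method:

```python
def cross_icon_segments(centers):
    """Return two line segments (as point pairs) forming an x over 4 points."""
    # sort by y, then x: top pair first, bottom pair second
    pts = sorted(centers, key=lambda p: (p[1], p[0]))
    top_left, top_right = sorted(pts[:2])
    bottom_left, bottom_right = sorted(pts[2:])
    # each stroke joins a top corner to the opposite bottom corner
    return [(top_left, bottom_right), (top_right, bottom_left)]

segments = cross_icon_segments([(0, 0), (4, 0), (0, 4), (4, 4)])
print(segments)  # → [((0, 0), (4, 4)), ((4, 0), (0, 4))]
```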
In the following description, it is assumed that, in the captured image i1 shown in fig. 8, the object on the left side of the drawing comprising the black circular marks of the mark m1 and the mirror image m', and the object on the right side of the drawing comprising the black circular marks of the mark m2, are each recognized as a mark m.
The explanation returns to fig. 7. When the plurality of marks m are recognized in step S2, the recognition processing unit 231 generates a confirmation image in which icons indicating the positions of the recognized marks m are superimposed on the captured image, and transmits the confirmation image to the administrator terminal 40 (step S3). Here, as an example, a case is assumed where the confirmation image i2 shown in fig. 9 is generated and transmitted to the administrator terminal 40. As shown in fig. 9, the confirmation image i2 includes an ×-shaped icon ic1 connecting the four black circular marks included in the mark m1 and the mirror image m' (which the recognition processing unit 231 recognized together as a mark m), and an ×-shaped icon ic2 connecting the four black circular marks included in the mark m2.
When the confirmation image transmitted from the recognition processing unit 231 of the image processing apparatus 20 is received, the administrator terminal 40 displays a display screen including the received confirmation image on its display (step S4). When the confirmation image i2 is transmitted in step S3 as described above, the display screen d1 including the confirmation image i2 is displayed on the administrator terminal 40, as shown in fig. 10. As shown in fig. 10, the display screen d1 includes the confirmation image i2, a button b1 (NG button) that the administrator presses to change the positions of the icons ic1 and ic2 included in the confirmation image i2, and a button b2 (OK button) that the administrator presses when the positions of the icons ic1 and ic2 do not need to be changed.
Next, the administrator terminal 40 generates an image to be returned to the image processing apparatus 20 (hereinafter referred to as a return image) in accordance with the administrator's operation of the displayed display screen, and transmits it to the image processing apparatus 20 (step S5). The administrator's operations on the display screen include, for example, pressing the button b1 included in the display screen d1 shown in fig. 10 and then changing the positions of the icons ic1 and ic2 included in the confirmation image i2, or pressing the button b2 included in the display screen d1 shown in fig. 10. Note that these operations may be performed by touch when the administrator terminal 40 is a terminal provided with a touch-panel display, or via an input device such as a mouse or a keyboard when the administrator terminal 40 is a PC.
Here, as an example, it is assumed that the button b1 included in the display screen d1 is pressed and an operation of changing the position of the icon ic1 included in the confirmation image i2 is performed (specifically, an operation of moving the ×-shaped icon ic1, which connects the black circular marks included in the mark m1 and the mirror image m', to the position connecting only the four black circular marks included in the mark m1), whereby the return image i3 shown in fig. 11 is generated.
When the button b2 included in the display screen d1 is pressed, that is, when the positions of the icons ic1 and ic2 included in the confirmation image i2 are not changed, the administrator terminal 40 may transmit the confirmation image i2 itself to the image processing apparatus 20 as the return image, without generating a new return image.
Next, when receiving the return image transmitted from the administrator terminal 40, the recognition processing unit 231 of the image processing apparatus 20 specifies the positions of the plurality of marks m included in the captured image based on the received return image (the positions of the icons superimposed on the return image) (step S6). Then, based on the camera setting values stored in the storage unit 201 (the height of the camera 12 and the angle of view of the camera 12), the recognition processing unit 231 calculates the relative position of the camera 12 with respect to each of the marks m specified in step S6 and the 3-axis angle of the camera 12 (the mounting angle of the camera 12) (step S7). When the return image i3 is transmitted in step S5 as described above, the recognition processing unit 231 specifies the positions of the marks m1 and m2 included in the captured image i1 from the return image i3 (the positions of the icons ic1 and ic2 superimposed on the return image i3), and calculates the relative position of the camera 12 with respect to the mark m1 and the relative position of the camera 12 with respect to the mark m2 as the relative positions of the camera 12 with respect to the plurality of marks m. Note that the point p1 in the captured image i1 of fig. 8 corresponds to the portion regarded as the mark m1, and the point p2 corresponds to the portion regarded as the mark m2.
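The geometry behind step S7 can be sketched under a simplifying assumption not made in the source: if the camera is taken to point straight down, a mark's pixel offset from the image center maps to a floor offset via the pinhole model, using the stored camera height and angle of view. The real unit also recovers the 3-axis mounting angle; here the camera is assumed level and only the horizontal offset is computed. Function and parameter names are invented for illustration.

```python
import math

def mark_position_relative_to_camera(px, py, image_size, fov_deg, height_m):
    """Map a mark's pixel coordinates to floor coordinates (meters) relative
    to the point directly below the camera, assuming a level, downward camera."""
    w, h = image_size
    # focal length in pixels from the horizontal angle of view
    f = (w / 2) / math.tan(math.radians(fov_deg) / 2)
    dx = (px - w / 2) * height_m / f
    dy = (py - h / 2) * height_m / f
    return dx, dy

# A mark seen at the right edge of a 90-degree-FOV image, camera 2.3 m up:
x, y = mark_position_relative_to_camera(640, 240, (640, 480), 90.0, 2.3)
print(round(x, 2), round(y, 2))  # → 2.3 0.0
```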
The calculation processing unit 232 included in the offset detection unit 203 calculates the relative position of the camera 12 with respect to the reference point based on the respective relative positions of the camera 12 with respect to the plurality of marks m calculated by the recognition processing unit 231 and the 1st set value stored in the storage unit 201 (step S8).
As described above, when the relative positions of the camera 12 with respect to the marks m1 and m2 are calculated in step S7, the calculation processing unit 232 calculates the relative position of the camera 12 with respect to the reference point by combining the relative position of the camera 12 with respect to the mark m1 and the relative position of the mark m1 with respect to the reference point, which is the 1st set value. Similarly, the calculation processing unit 232 calculates the relative position of the camera 12 with respect to the reference point by combining the relative position of the camera 12 with respect to the mark m2 and the relative position of the mark m2 with respect to the reference point, which is the 1st set value. Note that the point p3 in the captured image i1 of fig. 8 corresponds to the reference point.
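The combining in step S8 can be sketched as simple vector addition, under the assumption (not stated in the source) that each relative position is a planar offset vector; averaging the two estimates is likewise an assumption, since the source only says both are computed similarly:

```python
def camera_to_reference(camera_to_mark, mark_to_reference):
    """Compose camera->mark with mark->reference (the 1st set value)."""
    return tuple(a + b for a, b in zip(camera_to_mark, mark_to_reference))

est_m1 = camera_to_reference((1.0, -0.5), (-0.75, 0.5))  # via mark m1
est_m2 = camera_to_reference((-0.5, -0.5), (0.75, 0.5))  # via mark m2
combined = tuple((a + b) / 2 for a, b in zip(est_m1, est_m2))
print(est_m1, est_m2, combined)  # → (0.25, 0.0) (0.25, 0.0) (0.25, 0.0)
```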
Next, the detection processing unit 233 included in the offset detection unit 203 determines whether or not the mounting position of the camera 12 is shifted, based on the relative position of the camera 12 with respect to the reference point calculated by the calculation processing unit 232 and the 2nd set value stored in the storage unit 201 (step S9). Specifically, the detection processing unit 233 determines whether or not the relative position of the camera 12 with respect to the reference point calculated by the calculation processing unit 232 matches the relative position of the camera 12 with respect to the reference point that is the 2nd set value, thereby detecting whether or not the mounting position of the camera 12 is shifted.
If the relative positions of the camera 12 with respect to the reference point match (yes in step S9), the detection processing unit 233 determines that the mounting position of the camera 12 is not shifted and that the detection area does not need to be reset, and the series of processing ends here.
On the other hand, when the relative positions of the camera 12 with respect to the reference point do not match, that is, when the mounting position of the camera 12 is shifted (no in step S9), the setting processing unit 204 sets a detection region at an appropriate position corresponding to the shift of the mounting position of the camera 12 in the captured image acquired by the image acquisition unit 202, based on the relative position of the camera 12 with respect to the reference point calculated by the calculation processing unit 232 and the 3rd set value and the camera setting values stored in the storage unit 201 (step S10).
In the present embodiment, since a detection area having a predetermined range from the car sill 13a toward the lobby 15 side is assumed, the setting processing unit 204 first calculates the relative position of each vertex of the car sill 13a with respect to the camera 12 by combining the relative position of the camera 12 with respect to the reference point calculated by the calculation processing unit 232 and the relative position of each vertex of the car sill 13a with respect to the reference point, which is the 3rd set value. Note that the points p4 to p7 in the captured image i1 of fig. 8 correspond to the vertices of the car sill 13a.
Then, the setting processing unit 204 sets the detection area based on the calculated relative position of each vertex of the car sill 13a with respect to the camera 12, the 3-axis angle of the camera 12 calculated by the recognition processing unit 231, and the angle of view of the camera 12 stored in the storage unit 201 as the camera setting value.
Thus, in the captured image i1 acquired by the image acquisition unit 202, as shown by the hatched portion in fig. 8, a detection area e1 corresponding to the deviation of the mounting position of the camera 12, that is, a detection area e1 having a predetermined range from the long side of the car sill 13a on the car 11 side toward the lobby 15 side is set.
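The construction of the detection area e1 can be sketched in floor coordinates: take the car-side long edge of the sill (known from the vertex positions above) and extend it a fixed distance toward the lobby. The vertex ordering, coordinate convention (+y toward the lobby), and the 0.5 m depth are assumptions made for illustration.

```python
def detection_area(car_side_edge, depth_toward_lobby=0.5):
    """Build the detection polygon from the sill's car-side edge (meters),
    extending toward the lobby (+y) by a predetermined range."""
    (lx, ly), (rx, ry) = car_side_edge
    return [
        (lx, ly),
        (rx, ry),
        (rx, ry + depth_toward_lobby),
        (lx, ly + depth_toward_lobby),
    ]

area = detection_area([(-0.8, 0.0), (0.8, 0.0)])
print(area)  # → [(-0.8, 0.0), (0.8, 0.0), (0.8, 0.5), (-0.8, 0.5)]
```

Projecting this polygon back into the captured image would additionally require the 3-axis angle and angle of view of the camera 12, as the text states.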
Then, the notification processing unit 205 notifies (an administrator of) the monitoring center or (a terminal of) the maintenance person of the fact that the mounting position of the camera 12 is shifted via the communication device 25 (step S11), and ends the series of processing here.
In step S3 shown in fig. 7, the confirmation image is generated on the image processing apparatus 20 side by superimposing the icons indicating the positions of the plurality of marks m on the captured image, but the confirmation image may instead be generated on the administrator terminal 40 side. In this case, instead of step S3 described above, the recognition processing unit 231 of the image processing apparatus 20 may execute processing of transmitting, to the administrator terminal 40 as a recognition result, the captured image and the coordinate values on the captured image of the plurality of marks m recognized in step S2. In this case, when the administrator terminal 40 receives the captured image and the coordinate values transmitted from the recognition processing unit 231 of the image processing apparatus 20, the following processing may be executed instead of step S4 described above: a confirmation image is generated based on the received captured image and coordinate values, and a display screen including the generated confirmation image is displayed on the display.
In step S5 shown in fig. 7, the return image in which the positions of the plurality of marks m are specified is generated on the administrator terminal 40 side, but the return image may instead be generated on the image processing apparatus 20 side. In this case, instead of step S5 described above, the administrator terminal 40 may execute processing of transmitting, to the image processing apparatus 20, the coordinate values on the captured image (on the confirmation image) of the plurality of marks m specified by the administrator's operation of the display screen. In this case, when receiving the coordinate values transmitted from the administrator terminal 40, the image processing apparatus 20 may execute processing of specifying the positions of the plurality of marks m based on the received coordinate values and the captured image, instead of step S6 described above.
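In this alternative flow, only coordinate values travel between the two sides, so neither side needs to ship a composited image. A sketch of such an exchange as JSON payloads follows; the field names and coordinate values are invented for illustration and do not appear in the source.

```python
import json

# Image processing apparatus -> administrator terminal: recognition result
recognition_result = json.dumps({
    "marks": [
        {"id": "m1", "coords": [[120, 300], [140, 300], [520, 180], [540, 180]]},
        {"id": "m2", "coords": [[980, 300], [1000, 300], [980, 320], [1000, 320]]},
    ]
})

# Administrator terminal: replace the two mirror-image points wrongly
# grouped into m1, then return the corrected coordinate values.
corrected = json.loads(recognition_result)
corrected["marks"][0]["coords"][2:] = [[120, 320], [140, 320]]
print(corrected["marks"][0]["coords"])
# → [[120, 300], [140, 300], [120, 320], [140, 320]]
```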
Further, in step S9 shown in fig. 7, whether or not the mounting position of the camera 12 is shifted is determined (detected) based on whether or not the relative position of the camera 12 with respect to the reference point included in the captured image matches the relative position of the camera 12 with respect to the reference point included in the reference image, but the following configuration may also be adopted: even when the mounting position of the camera 12 is shifted, the detection area is not reset as long as the shift is small enough not to affect the accuracy of the user detection processing. That is, the processing of step S9 may be executed based on whether or not the difference (degree of deviation) between the relative position of the camera 12 with respect to the reference point included in the captured image and the relative position of the camera 12 with respect to the reference point included in the reference image is within a predetermined range, and it may be determined that the mounting position of the camera 12 is shifted only when the degree of deviation is outside the predetermined range.
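This within-range variant can be sketched as follows; the Euclidean distance metric and the 0.05 m permissible range are assumptions, since the source does not specify how the degree of deviation is measured:

```python
import math

def needs_reset(calculated, reference, permissible=0.05):
    """True when the degree of deviation is outside the predetermined range."""
    deviation = math.dist(calculated, reference)
    return deviation > permissible

assert not needs_reset((0.26, 0.0), (0.25, 0.0))  # tiny shift: no reset
assert needs_reset((0.40, 0.10), (0.25, 0.0))     # outside range: reset
```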
The setting of the detection region in the present embodiment refers to resetting an already set detection region, and may therefore also be expressed as correction of the detection region. In this case, the relative position of the camera 12 with respect to the reference point and the 3-axis angle of the camera 12 are both values required to correct the detection region, and may therefore be expressed as correction values.
In the present embodiment, the case where the above-described correction values are calculated on the image processing apparatus 20 side is exemplified, but the correction values may be calculated on the administrator terminal 40 side. In this case, the administrator terminal 40 stores the various set values stored in the storage unit 201 of the image processing apparatus 20 (for example, the 1st set value and the camera setting values), and is equipped with functions corresponding to the recognition processing unit 231 and the calculation processing unit 232 of the image processing apparatus 20 (specifically, functions for executing the processing of steps S7 and S8).
In the present embodiment, as shown in fig. 9, icons indicating the positions of the plurality of marks m (the icons ic1 and ic2 in the case of fig. 9) are superimposed on (included in) the confirmation image, but, for example, as shown in fig. 12, frame lines f1 and f2 may further be superimposed on (included in) the confirmation image, each surrounding a region in which a mark m can appear (hereinafter referred to as a region of interest (ROI)), set wide enough to include the mirror image m'. Accordingly, since the frame lines f1 and f2 surrounding the regions of interest are also displayed on the display screen d1 including the confirmation image i2, the administrator can clearly grasp where attention should be paid when specifying the positions of the plurality of marks m. That is, convenience for the administrator can be improved.
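One plausible way to form such a frame line, sketched here as an assumption rather than the patent's method, is to take the bounding box of the pixel positions where a mark's circles are expected and pad it so that a nearby mirror image m' also falls inside:

```python
def roi_frame(expected_points, pad=30):
    """Return (x_min, y_min, x_max, y_max) of a padded bounding frame."""
    xs = [p[0] for p in expected_points]
    ys = [p[1] for p in expected_points]
    return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)

print(roi_frame([(120, 300), (140, 300), (120, 320), (140, 320)]))
# → (90, 270, 170, 350)
```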
Further, in the present embodiment, the confirmation image is generated by superimposing icons indicating the positions of the plurality of recognized marks m on the captured image, but the confirmation image need not be generated from the captured image. As shown in fig. 13, for example, the confirmation image may be of any type as long as at least the relationship between the pattern of a mark m included in the captured image (here, the black circular marks) and the positions of the patterns recognized as marks m (here, the four black circular marks) can be grasped. Even in such a case, the administrator can, to some extent, grasp (estimate) from this relationship (the positions of the icons ic1 and ic2) whether or not the mirror image m' has been erroneously recognized as a mark m.
In the present embodiment, the elevator system acquires from the camera 12 an image captured in a state where marks m distinguishable from the floor surface of the car 11 and the floor surface of the lobby 15 are provided, recognizes the marks m from the acquired image, detects a shift in the mounting position of the camera 12 from the recognized marks m, and, when a shift in the mounting position of the camera 12 is detected, sets a set value relating to the image processing (user detection processing). The set values relating to the image processing include (the coordinate values of) the detection region, set for the captured image, for detecting a user near the car door 13.
According to such a configuration, even when the attachment position of the camera 12 is displaced, an appropriate detection region can be set for an image (for example, a rotated image or an image displaced in the left-right direction) captured by the camera 12, and thus a reduction in detection accuracy of a user can be suppressed.
Further, in the present embodiment, the elevator system recognizes the marks m from the captured image, presents the recognition result to the administrator, and, when the recognition result is erroneous, changes the recognition result in accordance with the administrator's operation (for example, by moving an icon indicating the position of a mark m) to specify the positions of the marks m. With this configuration, even if the marks m are provided in the vicinity of portions formed of a glossy metal material, erroneous recognition of the mirror image m' reflected on such a portion as a mark m can be suppressed.
According to the embodiment described above, it is possible to provide an elevator system capable of detecting a deviation in the installation position of the camera 12.
Although several embodiments of the present invention have been described, these embodiments are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are included in the invention described in the claims and their equivalents.

Claims (11)

1. An elevator system comprising: an image processing device capable of detecting a user near a door of a car by using an image, captured by a camera, that includes the inside of the car and a waiting hall; and an administrator terminal connected to the image processing device and operable by an administrator, the elevator system comprising:
an acquisition unit that acquires, from the camera, an image captured in a state in which a mark separable from a floor surface of the car and a floor surface of the hall is provided;
an identification unit that identifies the marker from the acquired image;
a display unit that displays a recognition result of the recognition unit;
a specifying unit that specifies a position of the recognized mark in accordance with an operation performed by the administrator on the recognition result displayed by the display unit; and
a detection unit that detects a deviation in the mounting position of the camera based on the determined position of the mark.
2. Elevator system according to claim 1,
the recognition result includes a confirmation image obtained by superimposing an icon indicating the position of the recognized mark on the acquired image,
the display unit displays the confirmation image.
3. Elevator system according to claim 1,
the recognition result includes the acquired image and the recognized coordinate value of the mark on the image,
the display unit generates a confirmation image in which an icon indicating the position of the mark is superimposed on the position indicated by the coordinate value on the image, based on the image and the coordinate value included in the recognition result, and displays the confirmation image.
4. Elevator system according to claim 2,
the confirmation image shows a region of interest on the acquired image in which the mark can appear.
5. Elevator system according to claim 2,
the specifying unit changes the position of the recognized mark in accordance with an operation of moving the icon performed by the administrator, and specifies the position of the recognized mark.
6. Elevator system according to claim 1,
the mark is provided at a position where a relative position to a sill that guides opening and closing of a door of the car can be determined.
7. Elevator system according to claim 6,
a plurality of the marks are provided on a floor surface in the car at both end portions of the sill.
8. Elevator system according to claim 1,
the elevator system further comprises a setting unit that, when a shift in the mounting position of the camera is detected, sets a set value relating to the image processing performed by the image processing device to detect a user near the door of the car.
9. Elevator system according to claim 8,
the detection unit calculates a relative position of the camera with respect to a reference point included in the acquired image and a mounting angle of the camera based on the determined position of the mark,
detects a shift in the mounting position of the camera when the calculated relative position of the camera with respect to the reference point does not coincide with the relative position of the camera with respect to the reference point included in a reference image captured when the mounting position of the camera is not shifted,
and, when a shift in the mounting position of the camera is detected, the setting unit sets the set value relating to the image processing based on the calculated relative position of the camera with respect to the reference point and the calculated mounting angle of the camera.
10. Elevator system according to claim 8,
the set value relating to the image processing includes a region, set for the image captured by the camera, for detecting the user.
11. Elevator system according to claim 1,
the elevator system further comprises a notification unit configured to notify an administrator of the occurrence of an abnormality when the shift in the mounting position of the camera is detected.
CN201911167951.7A 2019-03-20 2019-11-25 Elevator system Active CN111717738B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019053670A JP6673618B1 (en) 2019-03-20 2019-03-20 Elevator system
JP2019-053670 2019-03-20

Publications (2)

Publication Number Publication Date
CN111717738A true CN111717738A (en) 2020-09-29
CN111717738B CN111717738B (en) 2022-11-15

Family

ID=70000890

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911167951.7A Active CN111717738B (en) 2019-03-20 2019-11-25 Elevator system

Country Status (2)

Country Link
JP (1) JP6673618B1 (en)
CN (1) CN111717738B (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5182776A (en) * 1990-03-02 1993-01-26 Hitachi, Ltd. Image processing apparatus having apparatus for correcting the image processing
JP2000272863A (en) * 1999-03-24 2000-10-03 Hitachi Ltd Monitor for passenger conveyor
JP2002232874A (en) * 2001-02-07 2002-08-16 Nissan Motor Co Ltd Method and device for detecting deviation of imaging direction of camera for vehicle
JP2004083158A (en) * 2002-08-23 2004-03-18 Mitsubishi Electric Corp Remote rescue operation control device for elevator
JP2005077107A (en) * 2003-08-29 2005-03-24 Toyota Motor Corp Method and apparatus for calibrating in-vehicle camera
US20070084675A1 (en) * 2003-10-31 2007-04-19 Pengju Kang Rf id and low resolution ccd sensor based positioning system
JP2009214964A (en) * 2008-03-07 2009-09-24 Toshiba Elevator Co Ltd Maintenance management system of elevator
JP2010190612A (en) * 2009-02-16 2010-09-02 Honda Motor Co Ltd Axis adjustment target device
WO2010113673A1 (en) * 2009-03-31 2010-10-07 アイシン精機株式会社 Calibration device, method, and program for onboard camera
WO2012124068A1 (en) * 2011-03-16 2012-09-20 三菱電機株式会社 Elevator control device
JP2012205229A (en) * 2011-03-28 2012-10-22 Hitachi Ltd Camera-installation supporting method and image recognition method
JP2015113230A (en) * 2013-12-16 2015-06-22 ニチユ三菱フォークリフト株式会社 Unmanned forklift
JP5996725B1 (en) * 2015-06-04 2016-09-21 東芝エレベータ株式会社 Elevator control panel
JP6068694B1 (en) * 2016-01-13 2017-01-25 東芝エレベータ株式会社 Elevator boarding detection system
US20170197807A1 (en) * 2016-01-13 2017-07-13 Toshiba Elevator Kabushiki Kaisha Elevator system
CN107055238A (en) * 2016-01-13 2017-08-18 东芝电梯株式会社 Image processing apparatus
CN108622751A (en) * 2017-03-24 2018-10-09 东芝电梯株式会社 The boarding detection system of elevator
JP2018158842A (en) * 2018-05-29 2018-10-11 東芝エレベータ株式会社 Image analyzer and elevator system


Also Published As

Publication number Publication date
CN111717738B (en) 2022-11-15
JP2020152547A (en) 2020-09-24
JP6673618B1 (en) 2020-03-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant