GB2294114A - System for counting the number of people waiting for a lift - Google Patents


Info

Publication number
GB2294114A
GB2294114A (Application GB9520604A)
Authority
GB
United Kingdom
Prior art keywords
region
head
computes
certainty level
waiting passenger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9520604A
Other versions
GB2294114B (en)
GB9520604D0 (en)
Inventor
Kwang Sik Wang
Current Assignee
LS Electric Co Ltd
Original Assignee
LG Industrial Systems Co Ltd
Priority date
Filing date
Publication date
Application filed by LG Industrial Systems Co Ltd filed Critical LG Industrial Systems Co Ltd
Publication of GB9520604D0 publication Critical patent/GB9520604D0/en
Publication of GB2294114A publication Critical patent/GB2294114A/en
Application granted granted Critical
Publication of GB2294114B publication Critical patent/GB2294114B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00: Control systems of elevators in general
    • B66B1/34: Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00: Individual registration on entry or exit

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A difference picture is produced from an input picture of people waiting in an environment and a reference picture of the same environment with no people waiting. A partition mode test step divides the input picture into regions using brightness information from the input picture and the difference picture. Further steps use two different methods to estimate the number of regions corresponding to human heads. A waiting passenger number computation step combines these two estimations to calculate the number of waiting people.

Description

METHOD OF COMPUTING NUMBER OF WAITING PASSENGER FOR ELEVATOR BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a method of computing the number of waiting passengers for an elevator, and particularly to an improved method of computing the number of waiting passengers for an elevator using a computer vision system.
2. Description of the Conventional Art In the industry, advanced technologies for the group management control of elevators have been developed together with the related hardware and software. The group management control of an elevator allocates the car in the most suitable condition in response to a user's car call. To execute the above-mentioned technologies, it is most important to correctly predict the traffic flow. That is, it is necessary to correctly predict the number of passengers getting on and off at each floor until a specific car arrives at a predetermined floor. There are two methods of predicting the number of waiting passengers at each floor: one is based on statistical data and the other on a vision system.
The construction of a conventional apparatus for computing the number of waiting passengers for an elevator using statistical data will now be explained with reference to Fig. 1.
To begin with, as shown therein, there are provided a data generator 100 for computing the number of passengers getting on and off at each floor and for generating the corresponding data, a learning reference value generator 101 for generating a reference value so as to determine the learning type and level in accordance with the data outputted from the data generator 100, a data learning unit 102 for varying the getting-on and getting-off information obtained by the data generator 100 in accordance with the reference value for the learning type and level provided by the learning reference value generator 101, a traffic flow data base 103 for storing the data on getting-on and getting-off passengers obtained by the data learning unit 102, and a traffic flow prediction unit 104 for computing the traffic flow indicating the distribution of getting-on and getting-off passengers at each floor using the information stored in the traffic flow data base 103 when a predetermined car call is requested.
The operation of the conventional apparatus for computing the number of waiting passengers using statistical data will now be explained.
To begin with, the data generator 100 computes the number of passengers getting on and off at each floor and outputs the resulting data to the data learning unit 102. The data learning unit 102 determines the learning type and level from the reference information provided by the learning reference table of the learning reference value generator 101, varies the getting-on and getting-off information from the data generator 100 in accordance with the learning type and level, and outputs the number of getting-on and getting-off passengers to the traffic flow data base 103.
At this time, when a predetermined car call is requested by a user, the traffic flow prediction unit 104 computes the distribution of getting-on and getting-off passengers until a proper car is allocated for the call, using the information on the number of getting-on and getting-off passengers stored in the traffic flow data base 103.
However, the conventional apparatus for computing the number of waiting passengers using statistical data has the disadvantage that, when a user pushes the car call button, the traffic flow must first be predicted before a corresponding car can be allocated. That is, since the conventional apparatus relies on statistically accumulated data, it is impossible to quickly allocate a corresponding car. Therefore, a real-time traffic flow prediction system using a vision system is necessary.
Fig. 2 shows a conventional apparatus for computing the number of waiting passengers using a vision system. As shown therein, there are provided a camera group 200 consisting of a plurality of cameras for picturing waiting passengers, a camera controller 201 for converting the signals outputted from the camera group 200 into pictorial information and for transmitting the signals to the communication bus 205, a program ROM 202 for storing a program for computing the number of getting-on and getting-off passengers using the pictorial information outputted from the camera controller 201, a central processing unit 203 for processing the pictorial information in accordance with the program in the program ROM 202 and for transmitting the data on the number of getting-on and getting-off passengers to the communication bus 205, and a data RAM 204 for storing the data on the number of waiting passengers obtained by the central processing unit 203 and the information corresponding to the location of the camera group 200.
Fig. 3 shows a flow chart of the processing performed by the central processing unit 203 of Fig. 2. As shown therein, there are provided an input picture analyzing step which analyzes the pictorial information using the information corresponding to the location of the camera group 200, such as each camera's installation height and distance, when the pictorial information is inputted from the camera group 200, a threshold comparison step which removes all pixels except those whose brightness exceeds a threshold in the pictorial information analyzed by the input picture analyzing step, a partition mode test step which converts the brightness differences of the picture obtained by the threshold comparison step into distance information and partitions and integrates the pictorial regions using the distance information, a head certainty computation step which evaluates how similar the integrated and partitioned regions are to the shape of a human head, a region number computation step which counts the number of regions in which the head certainty level exceeds a predetermined level, and a passenger number computation step which determines the number of counted regions as the number of waiting passengers.
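The threshold comparison step above can be sketched as follows; the function name, the NumPy representation, and the sample pixel values are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def threshold_pixels(image, threshold):
    """Keep only pixels whose brightness exceeds the threshold;
    all other pixels are zeroed out (hypothetical helper)."""
    out = image.copy()
    out[out <= threshold] = 0
    return out

# A tiny 2x2 brightness picture: dark floor pixels vs. bright head pixels.
frame = np.array([[10, 200], [90, 160]], dtype=np.uint8)
kept = threshold_pixels(frame, 100)
# The pixels with brightness 10 and 90 are removed; 200 and 160 survive.
```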
The operation of an apparatus for computing the number of waiting passengers using a vision system will now be explained.
To begin with, when a user pushes a car call button, the camera group 200 pictures the waiting passengers at the corresponding floor and outputs the pictorial signals to the camera controller 201. The camera controller 201 outputs the pictorial signals to the central processing unit 203 through the communication bus 205. The central processing unit 203 analyzes the pictorial signals from the camera controller 201 received over the communication bus 205 and extracts the number of waiting passengers in the order shown in the flow chart of Fig. 3.
That is, when the pictorial signals are outputted to the central processing unit 203 through the communication bus 205, the central processing unit 203 analyzes the pictorial signals based on the information corresponding to the location of each camera, such as the camera's installation height and distance. Thereafter, the central processing unit 203 compares the brightness of the pictorial signals with a threshold and removes the pixels located below the face region. In the partition mode test step, the pixels exceeding the threshold are obtained, the brightness differences are converted into distance information, and the regions of the picture are partitioned or integrated using the distance information.
In more detail, the distance information is obtained by the expression M(A, B) = |I(A) - I(B)|, where I(A) denotes the brightness information of a region A, and I(B) denotes the brightness information of a region B adjacent to the region A. At this time, if the distance between the two regions is small, that is, if the brightness difference is small, the two regions are integrated with each other. In addition, if the distance between the two regions is large, that is, if the brightness difference is large, the two regions A and B are partitioned from each other. Thereafter, the picture is divided into a plurality of regions based on the distance information, and the resulting regions are transmitted to the head certainty level computation step.
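The merge-or-partition decision based on M(A, B) = |I(A) - I(B)| can be sketched as follows; the helper names, the use of mean region brightness, and the threshold value are hypothetical illustrations:

```python
def region_distance(mean_a, mean_b):
    """M(A, B) = |I(A) - I(B)|: the distance between two regions,
    taken here as the absolute difference of their mean brightness."""
    return abs(mean_a - mean_b)

def should_merge(mean_a, mean_b, threshold):
    """Integrate two adjacent regions when their brightness distance
    is small; otherwise keep them partitioned (threshold is assumed)."""
    return region_distance(mean_a, mean_b) < threshold

# Dark hair (30) next to a bright wall (150): distance 120 -> partition.
assert not should_merge(30, 150, 40)
# Two patches of the same face (100 and 105): distance 5 -> integrate.
assert should_merge(100, 105, 40)
```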
In the head certainty computation step, the certainty of a head shape is computed using the brightness difference and the similarity to the circular shape of a head with respect to the hair region and the skin region. Thereafter, in the region number computation step, the number of regions in which the head certainty level obtained by the head certainty level computation exceeds a predetermined level is counted. The counted number of regions is then recognized as the number of waiting passengers; for example, if there are two regions in which the head certainty level exceeds the predetermined level, two waiting passengers are counted.
However, the conventional apparatus for computing the number of waiting passengers using a vision system has the disadvantage that it is impossible to correctly picture the waiting passengers because the camera's installation height is relatively high.
In addition, it is difficult to correctly picture the waiting passengers because overlapping occurs in the pictures, so that accurately computing the number of waiting passengers is not easy. Moreover, since the conventional art uses only brightness information for computing the number of waiting passengers, it is impossible to picture the waiting passengers at each floor more accurately.
SUMMARY OF THE INVENTION Accordingly, it is an object of the present invention to provide a method of computing the number of waiting passengers for an elevator which overcomes the problems encountered in the conventional methods.
It is another object of the present invention to provide a method of computing the number of waiting passengers for an elevator using a computer vision system.
To achieve the above objects, in accordance with a first embodiment of the present invention, there is provided a method of computing the number of waiting passengers for an elevator using a vision system, which includes a partition mode test step which computes distance information using a brightness difference between pictures, in accordance with an input picture outputted from a camera provided at a predetermined portion of a floor near an elevator and a reference picture taken when there is no waiting passenger, and partitions and integrates the regions of the input picture in accordance with the distance information; a head certainty level computation step which computes a head certainty level using a certainty level with respect to a predetermined part of the body of a waiting passenger in each region obtained by the partition mode test step; and a region number computation step which counts the number of regions in which the head certainty level exceeds a predetermined level and judges the number of the regions as the number of waiting passengers.
In accordance with a second embodiment of the present invention, there is provided a method of computing the number of waiting passengers for an elevator using a vision system, which includes a partition mode test step which computes distance information using a brightness difference between pictures, in accordance with an input picture outputted from a camera provided at a predetermined portion of a floor near an elevator and a reference picture taken when there is no waiting passenger, and partitions and integrates the regions of the input picture in accordance with the distance information; a head certainty level computation step which computes a head certainty level by assigning a certainty level of a predetermined part of a waiting passenger to each partitioned and integrated region; a geometric feature extracting step which computes the number of pixels of a head part and the number of pixels of a body part, obtained by geometrically modeling the location of the camera and the shape of a waiting passenger; a body region removing step which removes all regions except those in which the head certainty level exceeds a predetermined level, using the number of pixels of the body part obtained by the geometric feature extracting step; and a pixel number computation step which computes the number of waiting passengers from the number of pixels contained in the regions obtained by the body region removing step and the number of pixels modeled by the geometric feature extracting step.
In accordance with a third embodiment of the present invention, there is provided a method of computing the number of waiting passengers for an elevator using a vision system, which includes a partition mode test step which computes distance information using a brightness difference between pictures, in accordance with an input picture outputted from a camera provided at a predetermined portion of a floor near an elevator and a reference picture taken when there is no waiting passenger, and partitions and integrates the regions of the input picture in accordance with the distance information; a geometric feature extracting step which computes the number of pixels of a head part and the number of pixels of a body part, obtained by geometrically modeling the location of the camera and the shape of a waiting passenger; a head certainty level computation step which computes a head certainty level based on the certainty level of a predetermined part of a waiting passenger; a region number computation step which computes the number of regions in which the head certainty level exceeds a predetermined level; a body region removing step which removes all regions except those in which the head certainty level exceeds a predetermined level, using the number of pixels of the body part obtained by the geometric feature extracting step; a pixel number computation step which computes the number of pixels contained in the regions obtained by the body region removing step and the number of pixels of a head region obtained by the geometric feature extracting step; and a waiting passenger number computation step which computes the number of waiting passengers from the values obtained by the pixel number computation step and the values obtained by the region number computation step.
BRIEF DESCRIPTION OF THE DRAWINGS Fig. 1 is a block diagram of a conventional apparatus for computing the number of waiting passengers for an elevator based on statistical data.
Fig. 2 is a block diagram of a conventional apparatus for computing the number of waiting passengers for an elevator based on a vision system.
Fig. 3 is a flow chart of the process of computing the number of waiting passengers for an elevator in the central processing unit of Fig. 2.
Fig. 4 is a block diagram of an apparatus for computing the number of waiting passengers for an elevator according to the present invention.
Fig. 5 is a flow chart of the operation of the apparatus of Fig. 4.
Fig. 6 is a view of the process of the partition mode test step using the pictorial difference between an input picture and a reference picture according to the present invention.
Fig. 7 is a detailed flow chart of the partition mode test step of Fig. 5.
DETAILED DESCRIPTION OF THE INVENTION The construction of an apparatus for computing the number of waiting passengers for an elevator using a vision system will now be explained with reference to Figs. 4 and 5.
To begin with, as shown therein, there are provided a camera group 300 including a plurality of cameras, each provided at a predetermined location on each floor, for picturing waiting passengers, a camera selection unit 301 for selectively outputting a pictorial signal from the camera group 300, and a vision system 302 for computing the number of waiting passengers by converting the pictorial signals outputted from the camera selection unit 301 into predetermined pictorial signals and transmitting the results to a television monitor, a display unit, and a group management control system (not shown).
Fig. 5 shows a flow chart of the operation of the apparatus of Fig. 4. As shown therein, a method of computing the number of waiting passengers using a vision system includes an input picture receiving step which receives an input picture obtained by picturing the waiting passengers in front of an elevator, a reference picture receiving step which receives a reference picture obtained by picturing the same location with no waiting passenger, a picture comparison step which computes a difference picture between the input picture and the reference picture, a partition mode test step which partitions the input picture into regions using the difference picture obtained by the picture comparison step and the brightness information of the input picture, a head certainty level computation step which assigns a certainty level with respect to a body, a face, a head, and hair to each region obtained by the partition mode test step and computes a head certainty level using the surface of each region, the ratio of the corresponding regions, and the brightness difference, a fuzzy relaxation step which increases the head certainty level obtained by the head certainty level computation step using the relationship between the body and the head, a region number computation step of counting the number of regions in which the head certainty level obtained by the fuzzy relaxation step exceeds a predetermined level, a geometric feature extracting step of computing the number of head pixels and the number of body pixels in the current region by modeling the location and distance of the corresponding camera and the size of the region of a waiting passenger, a body region removing step of removing the body regions from the pictorial signals using the information obtained by the fuzzy relaxation step and the information obtained by the geometric feature extracting step, a pixel number computation step of computing the pixel number of the regions obtained by the body region removing step and the pixel number obtained by the geometric feature extracting step, and a waiting passenger computation step of computing the number of waiting passengers using the pixel number obtained by the pixel number computation step and the number of regions obtained by the region number computation step.
The operation of the present invention will now be explained.
To begin with, when a passenger pushes a predetermined car call button, the camera group 300 including a plurality of cameras provided at each floor pictures the waiting passengers and transmits the pictures to the camera selection unit 301. The camera selection unit 301 outputs the selected pictorial signals to the vision system 302, which includes a central processing unit. The vision system 302 receives the input pictures through a communication bus. In addition, when there is no waiting passenger, the vision system 302 receives a reference picture from the camera selection unit 301. Here, the input picture and the reference picture are taken at the same location. The vision system 302 computes the number of waiting passengers following the flow charts shown in Figs. 5 and 7.
The vision system 302 computes the difference picture between the input picture and the reference picture as an absolute value so as to distinguish a waiting passenger's shape from the background picture, as shown in Fig. 6, so that whether a given region of the picture became brighter or darker can be neglected. This is because such sign information may introduce erroneous data during the computation of the number of waiting passengers.
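The absolute-value difference picture described above can be sketched as follows; the NumPy types and the sample brightness values are assumptions for illustration:

```python
import numpy as np

def difference_picture(input_pic, reference_pic):
    """Absolute per-pixel difference between the input picture (people
    present) and the reference picture (empty hall). Working in int16
    avoids uint8 wraparound, and taking the absolute value discards
    whether a pixel became brighter or darker."""
    diff = input_pic.astype(np.int16) - reference_pic.astype(np.int16)
    return np.abs(diff).astype(np.uint8)

# One pixel unchanged, one darkened from 100 to 40 by a dark coat.
ref = np.array([[100, 100]], dtype=np.uint8)
inp = np.array([[100, 40]], dtype=np.uint8)
diff = difference_picture(inp, ref)  # [[0, 60]]
```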
Therefore, the present invention utilizes the difference information even in dim pictures. When the difference picture is obtained, the partition or integration of the regions of the picture is executed by comparing the brightness information and the difference information during the partition mode test step. That is, the partition mode test step, as shown in Fig. 6, computes the difference information between the input picture and the reference picture with respect to all pixels and partitions and integrates the corresponding regions of the picture in accordance with the difference picture. That is, as shown in Fig. 6 and Fig. 7, in the partition mode test step, the absolute value of the brightness difference A' of neighboring pixels IP1 and IP2 is obtained until the brightness information and the difference information with respect to all pixels are obtained. That is, the brightness difference A' of two pixels IP1 and IP2 in the input picture is obtained in the form |I(P1) - I(P2)|.
Here, I(P1) denotes the brightness of the pixel IP1 in the input picture, and I(P2) denotes the brightness of the pixel IP2. Thereafter, the brightness difference B' between a pixel of the input picture and the pixel of the reference picture at the same location is obtained as an absolute value. That is, the difference DI(P1) between the brightness of the pixel IP1 of the input picture and the brightness of the pixel CP1 of the reference picture is obtained, and the difference DI(P2) between the brightness of the pixel IP2 of the input picture and the brightness of the pixel CP2 of the reference picture is obtained, so that the brightness difference B' is obtained. That is, the brightness difference B' between the input picture and the reference picture at the same location is obtained by the expression |DI(P1) - DI(P2)|, where DI(P1) denotes the brightness difference between the pixel IP1 of the input picture and the pixel CP1 of the reference picture, and DI(P2) denotes the brightness difference between the input picture pixel IP2 and the reference picture pixel CP2. Thereafter, the distance information M(P1, P2) is obtained by the expression M(P1, P2) = α|I(P1) - I(P2)| + (1 - α)|DI(P1) - DI(P2)|, where I(P1) and I(P2) denote the brightness of the neighboring pixels IP1 and IP2 in the input picture, DI(P1) and DI(P2) denote the brightness differences between the pixels IP1 and IP2 of the input picture and the pixels CP1 and CP2 of the reference picture at the same locations, and α denotes a predetermined coefficient for weighting the brightness difference between pixels in the input picture against the brightness difference between the input picture and the reference picture.
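The distance measure M(P1, P2) can be sketched as follows; the value of the coefficient α and the sample brightness values are illustrative, not taken from the patent:

```python
def pixel_distance(i_p1, i_p2, di_p1, di_p2, alpha=0.5):
    """M(P1, P2) = alpha*|I(P1) - I(P2)| + (1 - alpha)*|DI(P1) - DI(P2)|.
    alpha weights the within-picture brightness difference against the
    input-vs-reference difference (alpha = 0.5 is an assumption)."""
    return alpha * abs(i_p1 - i_p2) + (1 - alpha) * abs(di_p1 - di_p2)

# Neighbouring pixels with brightness 80 and 90 whose differences from
# the reference picture are 60 and 70 respectively:
m = pixel_distance(80, 90, 60, 70, alpha=0.5)  # 0.5*10 + 0.5*10 = 10.0
```

A small M(P1, P2) would then integrate the two pixels into one region; a large value would partition them, as described in the following paragraph.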
Thereafter, the distance information is compared with predetermined distance information, and the pixels are partitioned or integrated. That is, if the brightness difference between the two pixels IP1 and IP2 is large, the two pixels IP1 and IP2 are recognized as belonging to different regions and partitioned from each other. In addition, if the brightness difference therebetween is small, the two pixels IP1 and IP2 are recognized as belonging to the same region and integrated with each other. As a result, the first through sixth regions A1 through A6 shown in Fig. 6 are obtained.
In the head certainty level computation step, as shown in Fig. 6, a hair certainty level A, a face certainty level B, a head certainty level C, and a body certainty level D with respect to all regions A1 through A6 are obtained. That is, the head certainty level C is finally obtained using the above-mentioned levels A through D.
In the head certainty level computation step, a hair certainty level is obtained using the surface of each region, the ratio of the X and Y directions, and the brightness difference with respect to the first through sixth regions A1 through A6. Here, the first and second regions A1 and A2 shown in Fig. 6 have a relatively high hair certainty level A. In addition, the face certainty level B is obtained by computing the regions located near and below the first and second regions A1 and A2. In Fig. 6, the third and fourth regions A3 and A4 have a relatively high face certainty level B. Thereafter, the head certainty level C is obtained by adding the hair certainty level A and the face certainty level B. That is, the head certainty level C is obtained for the first and third regions A1 and A3 and for the second and fourth regions A2 and A4, respectively. Thereafter, the body certainty level D is obtained by computing the fifth and sixth regions A5 and A6, which are located near and below the third and fourth regions A3 and A4. Thereafter, the above-mentioned certainty levels are transmitted to the fuzzy relaxation step.
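A minimal sketch of combining the hair and face certainty levels into a head certainty level C; since the patent only states that C is obtained by adding A and B, the clipping to 1.0 and the sample levels are assumptions:

```python
def head_certainty(hair_level, face_level):
    """C = A + B, clipped to 1.0: a region pair scores high as a head
    when a hair-like region sits above a face-like region (the
    clipping and the 0..1 scale are assumptions)."""
    return min(hair_level + face_level, 1.0)

# A region pair like A1/A3 (dark hair above a skin-toned face):
c_strong = head_certainty(0.6, 0.5)    # clipped to 1.0
# A weak pair, e.g. shadow above a bag, scores much lower:
c_weak = head_certainty(0.25, 0.25)    # 0.5
```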
In the fuzzy relaxation step, the head certainty level C obtained with respect to the first through sixth regions A1 through A6 is increased up to a predetermined level. That is, a certainty level D' is obtained by assuming the first and third regions A1 and A3 and the second and fourth regions A2 and A4 to be head regions, and the fifth and sixth regions A5 and A6 to be body regions. If the certainty level D' is high, the first and third regions A1 and A3 are computed again so as to increase the head certainty level C by a predetermined amount. When the fuzzy relaxation step is finished with respect to all regions, the result is transmitted to the body region removing step and the region number computation step.
In the region number computation step, the number N' of regions in which the head certainty level C exceeds a predetermined level is counted. At this time, the number of waiting passengers can be computed from the region number N' obtained by the region number computation step. That is, if the region number N' is 2, the number of waiting passengers is 2.
However, to compute the number of waiting passengers more correctly, it is necessary to adopt a geometric method. That is, in the geometric feature extracting step, the pixel number P' corresponding to the head region in the current region, obtained by modeling, and the pixel number of the body region are provided to the body region removing step.
In more detail, the fifth and sixth regions A5 and A6 corresponding to the body region are removed using the body pixel number provided by the geometric feature extracting step, leaving only the first and third regions A1 and A3 and the second and fourth regions A2 and A4, in which the head certainty level C obtained by the fuzzy relaxation step exceeds a predetermined level.
That is, in the pixel number computation step, the pixel number P of the first and third regions A1 and A3 and the second and fourth regions A2 and A4, corresponding to the head regions remaining after the body region removing step, is obtained. In addition, the result N'' is obtained by the expression N'' = P / P', where P' denotes the head pixel number modeled by the geometric feature extracting step.
In this case, if the pixel number of a head region in which two heads overlap is 70, and if the pixel number of the modeled reference head is 50, the result N'' is 70 / 50 = 1.4, so that the number of waiting passengers can be computed more correctly.
When the pixel number corresponding to the head region is obtained, the number of waiting passengers can be computed. The final number of waiting passengers is computed by the following expression: the number of waiting passengers = δ * N' + (1 - δ) * N'', where δ is a weight reflecting the average height of a person. The central processing unit displays the result on a result display unit, a cable television monitor, and a group management control system. In addition, as shown in Fig. 4, a user interface is provided so as to control values such as α, δ, and the camera coefficients.
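The final combination of the region count N' and the pixel-based estimate N'' can be sketched as follows; the value of δ is illustrative, and the sample figures reuse the overlap example above (70 head pixels against a 50-pixel reference head):

```python
def waiting_passengers(n_regions, head_pixels, ref_head_pixels, delta=0.5):
    """Final count = delta * N' + (1 - delta) * N'', where N' is the
    number of high-certainty head regions and N'' is the counted head
    pixels divided by the modeled pixel count of one head. delta = 0.5
    is an assumed weight, not a value from the patent."""
    n_pixels = head_pixels / ref_head_pixels   # N''
    return delta * n_regions + (1 - delta) * n_pixels

# Two overlapping heads counted as one region (N' = 1) but with 70 head
# pixels against a 50-pixel reference head (N'' = 1.4):
count = waiting_passengers(1, 70, 50, delta=0.5)  # 0.5*1 + 0.5*1.4 ≈ 1.2
```

The pixel-based term pulls the estimate above the region count whenever heads overlap, which is the motivation for blending the two estimates rather than using N' alone.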
As described above, a method of computing the number of waiting passengers for an elevator using a vision system according to the present invention correctly computes the number of waiting passengers by counting the number of regions in which the head certainty level is high, by dividing the pixel number of those head regions by the head pixel number obtained by the geometric feature extracting step, and by combining the region count with the divided value. Therefore, even when head regions overlap, the number of waiting passengers can be correctly computed by the present invention. In addition, a passenger passing through the space in front of the elevator door is excluded from the computation of the number of waiting passengers. Moreover, a passenger who stays in the space in front of the elevator door for a predetermined time can also be excluded for a more correct computation. In addition, a tree or other objects near the elevator door can also be excluded in the present invention.

Claims (10)

What is claimed is:
1. A method of computing the number of a waiting passenger for an elevator using a vision system comprising the steps of: a partition mode test step which computes a distance information using a brightness difference between pictures in accordance with an input picture outputted from a camera provided at a predetermined portion of a floor near an elevator and a reference picture pictured when there is no waiting passenger, and partitions and integrates the regions of an input picture in accordance with said distance information; a head certainty level computation step which computes a head certainty level using a certainty level with respect to a predetermined part of the body of a waiting passenger in a region obtained by said partition mode test step; and a region number computation step which counts the number of a region in which a head certainty level exceeds a predetermined level and judges the number of the regions as the number of a waiting passenger.
2. The method of claim 1, wherein said difference picture is directed to obtain an absolute value using an input picture pictured when there is at least one waiting passenger at a predetermined floor and a reference picture pictured when there is no waiting passenger at said predetermined floor.
3. The method of claim 1, wherein said partition mode test step includes the steps of: a first step which computes an absolute value using a brightness difference between neighboring pixels in an input picture; a second step which computes an absolute value using a brightness difference between an input picture and a reference picture at the same location; a third step which computes a distance information using the values obtained by said first step and said second step; a fourth step which compares said distance information with a predetermined value; and a fifth step which integrates two pixels when the distance information is smaller than a predetermined level and partitions them when the distance information is greater than said predetermined level.
4. The method of claim 3, wherein said distance information is obtained based on an expression of α|I(P1) - I(P2)| + (1-α)|DI(P1) - DI(P2)|, where I(P1) and I(P2) denote the brightness of neighboring pixels IP1 and IP2 in an input picture, DI(P1) and DI(P2) denote brightness differences between said pixels IP1 and IP2 in an input picture and pixels CP1 and CP2 of a reference picture at the same location as the corresponding input pixels IP1 and IP2, and α denotes a predetermined coefficient for controlling the brightness difference between pixels in an input picture and the brightness difference between an input picture and a reference picture.
5. The method of claim 1, wherein said head certainty level computation step includes the steps of: a first step which computes a hair certainty level using an integrated and partitioned region obtained by the partition mode test step; a second step which computes a face certainty level using a region below and adjacent to the region of said hair certainty level; a third step which computes a head certainty level using an integration of the region of said hair certainty level and the region of said face certainty level; and a fourth step which computes a body certainty level using the region except the region of said head certainty level.
6. The method of claim 5, wherein said third step includes a sub-step which increases the head certainty level by recognizing the region in which the head certainty level is high as a head part, and the region located below said region as a body part.
7. A method of computing the number of a waiting passenger for an elevator using a vision system comprising the steps of: a partition mode test step which computes a distance information using a brightness difference of pictures in accordance with an input picture outputted from a camera provided at a predetermined portion of a floor near an elevator and a reference picture pictured when there is no waiting passenger, and partitions and integrates the regions of an input picture in accordance with said distance information; a head certainty level computation step which computes a head certainty level by providing a certainty level of a predetermined part of a waiting passenger to a partitioned and integrated region; a geometric feature extracting step which computes the number of a pixel of a head part and the number of a pixel of a body part which are obtained by geometrically modeling the location of a camera and the shape of a waiting passenger; a body region removing step which removes all regions except the region in which the head certainty level exceeds a predetermined level using the number of a pixel of a body part obtained by said geometric feature extracting step; and a pixel number computation step which computes the number of a waiting passenger by computing the number contained in the region obtained by said body region removing step and the number of a pixel modeled by the geometric feature extracting step.
8. The method of claim 7, wherein said number of a pixel is computed by:
where N denotes the number of a region, and P denotes the number of a head pixel obtained by a body region removing step, and P' denotes the number of a head pixel obtained by modeling through a geometric feature extracting step.
9. A method of computing the number of a waiting passenger for an elevator using a vision system comprising the steps of: a partition mode test step which computes a distance information using a brightness difference of pictures in accordance with an input picture outputted from a camera provided at a predetermined portion of a floor near an elevator and a reference picture pictured when there is no waiting passenger, and partitions and integrates the regions of an input picture in accordance with said distance information; a geometric feature extracting step which computes the number of a pixel of a head part and the number of a pixel of a body part which are obtained by geometrically modeling the location of a camera and the shape of a waiting passenger; a head certainty level computation step which computes a head certainty level based on the certainty level of a predetermined part of a waiting passenger; a region number computation step which computes the number of a region in which a head certainty level exceeds a predetermined level; a body region removing step which removes all regions except the region in which the head certainty level exceeds a predetermined level using the number of a pixel of a body part obtained by said geometric feature extracting step; a pixel number computation step which computes the number contained in the region obtained by said body region removing step and the number of a pixel of a head region obtained by the geometric feature extracting step; and a waiting passenger number computation step which computes the number of a waiting passenger by computing the values obtained by said pixel number computation step and the values obtained by the region number computation step.
10. The method of claim 9, wherein said waiting passenger number computation step is directed to compute the number of a waiting passenger by δ * N' + (1-δ) * N", where N' denotes the value obtained by the region number computation step, N" denotes the value obtained by the pixel number computation step, and δ denotes a predetermined weight.
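The distance information of claims 3 and 4 can be sketched as follows. The formula is the one given in claim 4; the concrete pixel brightnesses, the value of α, and the threshold are illustrative assumptions.

```python
def distance_information(i_p1, i_p2, di_p1, di_p2, alpha):
    """Claim 4: D = alpha*|I(P1) - I(P2)| + (1-alpha)*|DI(P1) - DI(P2)|,
    where I(P1), I(P2) are brightnesses of neighboring input pixels and
    DI(P1), DI(P2) are their brightness differences against the reference
    picture at the same locations."""
    return alpha * abs(i_p1 - i_p2) + (1 - alpha) * abs(di_p1 - di_p2)

def merge_or_partition(distance, threshold):
    """Claim 3, fifth step: integrate the two pixels when the distance
    information is below the predetermined level, partition otherwise."""
    return "integrate" if distance < threshold else "partition"

# Assumed example: neighboring pixels with brightness 10 and 12 whose
# reference-picture differences are both 5, weighted with alpha = 0.5.
d = distance_information(10, 12, 5, 5, alpha=0.5)
decision = merge_or_partition(d, threshold=2.0)
```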
GB9520604A 1994-10-10 1995-10-09 Method of computing number of waiting passengers for elevator Expired - Fee Related GB2294114B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1019940025907A KR960013086B1 (en) 1994-10-10 1994-10-10 Elevator wait passenger number detecting method

Publications (3)

Publication Number Publication Date
GB9520604D0 GB9520604D0 (en) 1995-12-13
GB2294114A true GB2294114A (en) 1996-04-17
GB2294114B GB2294114B (en) 1999-05-05

Family

ID=19394792

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9520604A Expired - Fee Related GB2294114B (en) 1994-10-10 1995-10-09 Method of computing number of waiting passengers for elevator

Country Status (4)

Country Link
JP (1) JP2599701B2 (en)
KR (1) KR960013086B1 (en)
CN (1) CN1047358C (en)
GB (1) GB2294114B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006000495A1 (en) * 2006-09-28 2008-04-03 Vis-à-pix GmbH Automated equipment management system for control of controllable equipment of system, has automated image based recorder unit, and processing unit that is connected with recorder unit and controlled device
US7362370B2 (en) * 2002-01-22 2008-04-22 Fujifilm Corporation Image capturing apparatus, image capturing method, and computer-readable medium storing program using a distance measure for image correction
EP2546807A3 (en) * 2011-07-11 2013-05-01 Optex Co., Ltd. Traffic monitoring device
US8873804B2 2011-07-11 2014-10-28 Optex Co., Ltd. Traffic monitoring device
CN105224992A (en) * 2014-05-28 2016-01-06 International Business Machines Corp. Method and system for predicting the number of waiting passengers, and evaluation method and system

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3251228B2 (en) * 1998-03-31 2002-01-28 株式会社エヌ・ティ・ティ ファシリティーズ Elevator control method and device
JP4318465B2 (en) 2002-11-08 2009-08-26 コニカミノルタホールディングス株式会社 Person detection device and person detection method
KR100749889B1 (en) * 2005-07-04 2007-08-16 정제창 System and method for counting audience number
CN100462295C (en) * 2006-09-29 2009-02-18 浙江工业大学 Intelligent dispatcher for group controlled lifts based on image recognizing technology
JP2009143722A (en) * 2007-12-18 2009-07-02 Mitsubishi Electric Corp Person tracking apparatus, person tracking method and person tracking program
CN101723208B (en) * 2009-04-02 2012-08-08 浙江大学 Method and system for optimal lift allocation in commercial and residential multifunctional building
KR101220333B1 (en) * 2010-05-13 2013-01-09 엘지이노텍 주식회사 Apparatus and method for controlling elevator
WO2016110866A1 (en) * 2015-01-07 2016-07-14 Sai Narayanan Hari Intelligent elevator management system using image processing
JP6339518B2 (en) * 2015-03-31 2018-06-06 株式会社日立製作所 Installation number calculation device and calculation method of destination floor registration device
JP6963464B2 (en) * 2017-10-30 2021-11-10 株式会社日立製作所 Elevator boarding / alighting number estimation device, boarding / alighting number estimation method and boarding / alighting number estimation program
CN113135478B (en) * 2021-05-08 2023-03-21 上海三菱电梯有限公司 Elevator internal visitor permission management method, readable storage medium and computer equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2230622A (en) * 1989-03-20 1990-10-24 Hitachi Ltd Lift control system
GB2243906A (en) * 1990-03-02 1991-11-13 Hitachi Ltd Image processing apparatus
US5298697A (en) * 1991-09-19 1994-03-29 Hitachi, Ltd. Apparatus and methods for detecting number of people waiting in an elevator hall using plural image processing means with overlapping fields of view


Also Published As

Publication number Publication date
JP2599701B2 (en) 1997-04-16
CN1132718A (en) 1996-10-09
GB2294114B (en) 1999-05-05
GB9520604D0 (en) 1995-12-13
CN1047358C (en) 1999-12-15
JPH08127476A (en) 1996-05-21
KR960013967A (en) 1996-05-22
KR960013086B1 (en) 1996-09-30

Similar Documents

Publication Publication Date Title
GB2294114A (en) System for counting the number of people waiting for a lift
US7321668B2 (en) Object detection apparatus, object detection method and computer program product
EP0756426B1 (en) Specified image-area extracting method and device for producing video information
KR100490972B1 (en) Method and device for real-time detection, location and determination of the speed and direction of movement of an area of relative movement in a scene
US7305106B2 (en) Object detection apparatus, object detection method and recording medium
US6173069B1 (en) Method for adapting quantization in video coding using face detection and visual eccentricity weighting
KR101803474B1 (en) Device for generating multi-view immersive contents and method thereof
EP2068569A1 (en) Method of and apparatus for detecting and adjusting colour values of skin tone pixels
GB2328504A (en) Eye position detector
CN110713082B (en) Elevator control method, system, device and storage medium
CN106875371A (en) Image interfusion method and image fusion device based on Bayer format
JP7099809B2 (en) Image monitoring system
CN108460319B (en) Abnormal face detection method and device
CN113269111B (en) Video monitoring-based elevator abnormal behavior detection method and system
CN111137761A (en) Face recognition elevator false triggering prevention method and device and storage medium
CN116994313A (en) Mask detection system and method based on edge calculation under elevator scene
JPH08249471A (en) Moving picture processor
CN116486383A (en) Smoking behavior recognition method, smoking detection model, device, vehicle, and medium
JP2000053361A (en) Passenger monitoring device for man-conveyer
JPH04174309A (en) Driver&#39;s eye position detecting apparatus and condition detecting apparatus
CN115108426A (en) Elevator control method, elevator control device, computer equipment and readable storage medium
JP2022190504A (en) Image analysis device and monitoring system
JPH11286377A (en) Elevator control method and its device
CN112158695A (en) Elevator control method
US20220405956A1 (en) Monitoring system

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20001009