CN103964271B - Elevator monitoring arrangement and elevator monitoring method - Google Patents

Elevator monitoring arrangement and elevator monitoring method

Info

Publication number
CN103964271B
CN103964271B (application CN201410017454.XA)
Authority
CN
China
Prior art keywords
passenger
car
height
elevator monitoring
image pickup
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410017454.XA
Other languages
Chinese (zh)
Other versions
CN103964271A (en)
Inventor
高桥一哉
酒井亮一
国贞拓也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Hitachi Building Systems Co Ltd
Original Assignee
Hitachi Ltd
Hitachi Building Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd, Hitachi Building Systems Co Ltd filed Critical Hitachi Ltd
Publication of CN103964271A publication Critical patent/CN103964271A/en
Application granted granted Critical
Publication of CN103964271B publication Critical patent/CN103964271B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Indicating And Signalling Devices For Elevators (AREA)
  • Image Processing (AREA)
  • Elevator Control (AREA)

Abstract

Provided are an elevator monitoring device and an elevator monitoring method that determine the degree of congestion of a car by processing images from an imaging unit installed in the car so as to face obliquely downward. Using an image captured by the obliquely downward-facing imaging unit that photographs the inside of the elevator car, the height of a part of a passenger is measured with reference to a mark of known height in the car; the passenger is tracked in the car based on the images captured by the imaging unit; taking the on-screen position of the part of the passenger in the image as input, the projected position, as observed from directly above, obtained when that part of the passenger is projected from the imaging unit onto the floor of the car is calculated; the standing position of the passenger as seen from directly above is calculated based on the height of the imaging unit, the height of the part of the passenger, and the positional relationship between the position of the imaging unit and the projected position; the area occupied by all passengers when the inside of the car is viewed from directly above is estimated based on the standing positions of the passengers; and the degree of congestion of the car is determined based on the occupied area.

Description

Elevator Monitoring Device and Elevator Monitoring Method

Technical Field

The present invention relates to an elevator monitoring device and an elevator monitoring method.

Background Art

In an elevator car, the state in which there is no remaining room because of passengers, luggage, and the like is called the full state. Even when the car is not full, the degree of congestion can be quantified according to how much room remains. Patent Document 1 describes installing a camera in the upper part or on a side panel of the car looking straight down and measuring the change in the floor area occupied by passengers or luggage (paragraph 0006). Patent Document 2 describes a device that measures the height of a person at boarding from images captured by an in-car camera looking straight down, and corrects, based on that height, the area corresponding to the passengers and the like detected by background subtraction (paragraphs 0043 to 0049). Patent Document 3 describes measuring the height and position of a subject from the distance information to the object obtained by a stereo camera installed in the car and from the camera arrangement information (paragraphs 0039 to 0042). Patent Document 4 describes a device and method for monitoring vehicles with a single camera that overlooks a road obliquely downward; there, the vehicle position obtained by perspective projection is corrected based on predetermined typical sizes of large and small vehicles.

In addition, installing a security camera in an elevator car is common practice, and it is desired to use the images acquired by this security camera to determine whether the car is full or to measure its degree of congestion. Security cameras are usually installed in a far corner of the car, facing the boarding entrance.

Prior Art Literature

Patent Documents

Patent Document 1: Japanese Patent Laid-Open No. 2006-240825

Patent Document 2: Japanese Patent Laid-Open No. H8-26611

Patent Document 3: Japanese Patent Laid-Open No. 2001-34883

Patent Document 4: Japanese Patent Laid-Open No. 2008-299458

Problems to Be Solved by the Invention

With a camera looking straight down at the floor from the ceiling of the elevator car, it is relatively easy to measure the floor area occupied by passengers, as in Patent Document 1. However, in an image acquired by a security camera looking obliquely downward from a corner of the car ceiling, the floor area apparently occupied by a person is measured as larger than the actually occupied area, so the degree of congestion is evaluated as higher than it really is, and the car may be judged full even though it is not.

Patent Document 2, in view of the facts that the number of passengers is estimated from the passenger area captured by the camera and that the passenger area becomes larger the closer the subject is to the camera, corrects the passenger area by multiplying it by a correction coefficient corresponding to the passenger's height. However, because no correction is made for the case of viewing obliquely from above, it shares with Patent Document 1 the problem that the degree of congestion is overestimated depending on where passengers stand.

In Patent Document 3, the height and position of an object can be measured from stereo ranging values. However, at least two additional cameras for stereo ranging are required in addition to the security camera, which raises the problems of increased equipment cost and a degraded appearance of the car interior due to the additional cameras.

Patent Document 4 describes a device and method for monitoring vehicles with a single camera that overlooks a road obliquely downward. There, the vehicle position obtained by perspective projection is corrected based on predetermined typical sizes of large and small vehicles. If this method is applied directly to the inside of an elevator car, which is a far narrower area than a road, the correction accuracy is insufficient even if several typical person heights or widths are prepared, and, unlike vehicles, it is difficult to judge which person height or width should be applied.

Summary of the Invention

Therefore, an object of the present invention is to provide an elevator monitoring device and an elevator monitoring method that determine the degree of congestion of a car by processing the images of a camera looking obliquely downward, such as an ordinary security camera in an elevator car.

Means for Solving the Problems

To solve this problem, for example, an elevator monitoring device provided with an imaging unit installed so as to face obliquely downward and photograph the inside of the elevator car includes: a passenger height measurement unit that measures the height of a part of a passenger with reference to a mark of known height in the car; a passenger tracking unit that tracks the passenger in the car based on the images captured by the imaging unit; a passenger projection unit that takes as input the on-screen position of the part of the passenger in the image and calculates the projected position, as observed from directly above, obtained when that part of the passenger is projected from the imaging unit onto the floor of the car; a passenger standing position calculation unit that calculates the standing position of the passenger as seen from directly above, based on the height of the imaging unit, the height of the part of the passenger, and the positional relationship between the position of the imaging unit and the projected position; an occupied area estimation unit that estimates, based on the standing positions of the passengers, the area occupied by all passengers when the inside of the car is viewed from directly above; and a congestion degree determination unit that determines the degree of congestion of the car based on the occupied area.

Effects of the Invention

According to the present invention, the degree of congestion of the car can be determined by processing the images of an imaging unit installed in the car so as to face obliquely downward. Problems, configurations, and effects other than those described above will become clear from the following description of the embodiments.

Brief Description of the Drawings

Fig. 1 is an example of a configuration diagram of the elevator monitoring device.

Fig. 2 is an example of the processing flow of the elevator monitoring device.

Fig. 3 shows an installation example of the imaging unit in the elevator monitoring device and examples of images acquired by the imaging unit.

Fig. 4 is a diagram showing the relationship between the projected position of a person and the true standing position in the elevator monitoring device.

Fig. 5 is a diagram showing the relationship between the projected position of a person and the true standing position in the elevator monitoring device, in the orientation of looking straight down at the floor of the car.

Fig. 6 is a diagram showing a concrete example of the projective transformation.

Fig. 7 is a diagram in which the elevator monitoring device estimates the occupancy state of the floor of the car.

Detailed Description

Embodiments of the present invention will be described with reference to the drawings. In the figures, the same or similar components are given the same reference numerals, and their description is not repeated.

Fig. 1 is an example of a configuration diagram of the elevator monitoring device. The elevator monitoring device of this embodiment is composed of an arithmetic control unit 1 and an imaging unit 2. The arithmetic control unit 1 includes a passenger height measurement unit 3, a passenger tracking unit 4, a passenger projection unit 5, a passenger standing position calculation unit 6, an occupied area estimation unit 7, a congestion degree determination unit 8, an operation control unit 9, and an output unit 10. Part or all of the functions of the arithmetic control unit 1 can be implemented in software using an embedded image processing device, a personal computer, or the like. Part or all of the functions of the arithmetic control unit 1 may also be implemented in hardware such as an integrated circuit.

The operation control unit 9 includes a user interface that allows the arithmetic control unit 1 to be started, stopped, or initialized from the outside. The output unit 10 has the function of outputting the state of the arithmetic control unit 1 and the determination result of the congestion degree determination unit 8 to the outside.

Fig. 2 is a diagram showing an example of the processing flow of the elevator monitoring device. The configuration shown in Fig. 1 is explained along the processing flow of Fig. 2. Immediately after the elevator monitoring device starts operating, each unit is initialized (step s01). This is the initialization of the memory and flags used for the processing of the arithmetic control unit 1; when the arithmetic control unit 1 is implemented with a personal computer, this corresponds to allocating and initializing memory. After initialization, the arithmetic control unit 1 repeats the processing from the image acquisition in step s02 to the congestion determination output in step s08. When returning from step s08 to step s02, it is checked in step s09 whether a termination interrupt or a power-off has occurred; if not (Yes), the above processing is repeated, and if so (No), the processing ends.
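
As an informal illustration only (not part of the patent text), the loop of Fig. 2 can be sketched in Python as follows. Every method name below is a hypothetical placeholder for the processing units of Fig. 1; only the loop structure follows the description above.

```python
class ElevatorMonitor:
    """Skeleton of the arithmetic control unit 1; method bodies are placeholders
    for the processing described later in this section."""

    def __init__(self):
        self.tracked = {}              # step s01: per-passenger state (height h, width w, position)

    def acquire_image(self):           # step s02: read one frame from the imaging unit 2
        raise NotImplementedError

    def measure_new_passengers(self, frame): ...   # step s03: measure h (and w) at boarding
    def track_passengers(self, frame): ...         # step s04: per-frame tracking
    def project_to_floor(self): ...                # step s05: perspective projection onto the floor
    def compute_standing_positions(self): ...      # step s06: eq. (3) or eqs. (4), (5)
    def estimate_occupied_area(self): ...          # step s07: occupied area of all passengers
    def judge_and_output(self, area): ...          # step s08: congestion determination output

    def run(self, stop_requested):
        while not stop_requested():                # step s09: termination / power-off check
            frame = self.acquire_image()
            self.measure_new_passengers(frame)
            self.track_passengers(frame)
            self.project_to_floor()
            self.compute_standing_positions()
            area = self.estimate_occupied_area()
            self.judge_and_output(area)
```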

Before explaining the details of the processing flow of Fig. 2, an installation example of the imaging unit 2 and examples of images acquired by the imaging unit 2 are described based on Fig. 3. Fig. 3 shows an installation example of the imaging unit in the elevator monitoring device and examples of images acquired by the imaging unit. Fig. 3(a) is a view of the inside of the elevator car seen from the direction of the entrance. In this example, a security camera is installed as the imaging unit 2 in the upper-right far corner of the car. Figs. 3(b) and 3(c) are examples of images acquired by the imaging unit 2. Fig. 3(b) shows the moment when a person 20 boards as a passenger while the elevator door is open, and Fig. 3(c) shows the person 20 boarding or staying after the elevator door has closed.

In the processing flow of Fig. 2, when a passenger passes through the entrance of the car, the height h, or the height h and width w, of that passenger is measured with reference to a mark of known height in the car (for example, the entrance of the car) (step s03). As shown in Fig. 3(b), the actual size of the car entrance can be measured in advance. Let the height of the entrance be EH and its width be EW. After the imaging unit 2 has been installed, it can also be measured how many pixels the entrance height EH and width EW each correspond to in the image acquired by the imaging unit 2. Here, the number of pixels in the height direction of the entrance is Hpix, and the number of pixels in the width direction is Wpix.

In step s03, image processing is first applied to the images acquired by the imaging unit 2, and the contour or circumscribed rectangle of the person 20 is detected by inter-frame differencing and binarization of those images. Since such image processing is itself a common technique, a detailed description is omitted. The numbers of pixels of the height and width of the contour or circumscribed rectangle are then obtained. Here, if the number of pixels in the height direction of the person 20 in the image is hpix and the number of pixels in the width direction is wpix, the height h and width w of the person 20 can be measured by the proportional calculations of the following equations (1) and (2).

h = EH × hpix / Hpix … (1)

w = EW × wpix / Wpix … (2)
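
As a minimal sketch of this proportional calculation (the entrance dimensions and pixel counts in the example are invented values, not from the patent):

```python
# Proportional height/width measurement of eqs. (1) and (2).
# EH, EW: real entrance height/width; Hpix, Wpix: their sizes in pixels;
# hpix, wpix: pixel size of the detected person's contour or bounding box.

def measure_passenger(EH, EW, Hpix, Wpix, hpix, wpix):
    h = EH * hpix / Hpix   # eq. (1)
    w = EW * wpix / Wpix   # eq. (2)
    return h, w

# Example with assumed values: a 2.0 m x 0.9 m entrance spanning 400 x 180 px,
# and a person whose bounding box is 330 x 90 px in the image.
h, w = measure_passenger(2.0, 0.9, 400, 180, 330, 90)
print(round(h, 2), round(w, 2))   # -> 1.65 0.45
```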

The person 20 whose height h and so on were measured in step s03 is immediately thereafter tracked continuously as a passenger (step s04). Each time a new passenger boards through the entrance, the height h and so on are measured by the above method in step s03, and that passenger becomes a target of the passenger tracking in step s04. The tracking is performed, every time an image is acquired (step s02), by comparing, for each person, the on-screen position in the previous image with the position in the current image; the passenger tracking of step s04 is therefore executed in every loop from step s02 to step s08. On the other hand, the measurement of the passenger height h in step s03 only needs to be performed at least once at boarding, so a person whose height h and width w have already been measured passes through step s03 without any processing. In the passenger tracking of step s04, the measured height h and so on are carried along as attributes of the person. Therefore, even if the person 20 merely stays in the car for a while as in Fig. 3(c), the information on the height h and width w measured at boarding is retained so that it can be used. The passenger tracking of step s04 may also be performed by pattern matching. Since such tracking processing is itself a common technique, a detailed description is omitted.
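
The tracking method itself is left open in the text (frame-to-frame position comparison or pattern matching). Purely as an illustration of the position-comparison idea, a minimal nearest-neighbour association that carries the height measured at boarding along with each track might look like the following sketch; it is not the patent's prescribed method.

```python
# Minimal sketch of step s04: associate each tracked passenger with the nearest
# detection in the current frame, keeping the height/width measured at boarding
# as attributes of the track. The max_jump threshold is an assumed value.
import math

def update_tracks(tracks, detections, max_jump=80.0):
    """tracks: {track_id: {'pos': (x, y), 'h': float, 'w': float}}
    detections: list of (x, y) positions of the tracked body part in the current image."""
    unused = list(detections)
    for t in tracks.values():
        if not unused:
            break
        nearest = min(unused, key=lambda p: math.dist(p, t['pos']))
        if math.dist(nearest, t['pos']) <= max_jump:   # plausible per-frame motion
            t['pos'] = nearest
            unused.remove(nearest)
    return tracks
```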

In the processing flow of Fig. 2, for all passengers tracked in the car, the projection of the passenger onto the floor (step s05) and the calculation of the passenger's standing position (step s06) are performed using perspective projection.

First, the basic idea is explained using Fig. 4. Fig. 4 is a diagram showing the relationship between the projected position of a person and the true standing position in the elevator monitoring device. Fig. 4(a) is, like Fig. 3(c), an image in which the person 20 is staying in the car. In this image, the person has been tracked continuously from boarding by pattern matching and the like, so the top of the head at which the person's height h was measured is known to be at the position indicated by A in Fig. 4(a). The top of the head is therefore projected by perspective projection to the projected position B in Fig. 4(b), and the distance D from the point directly below the imaging unit 2 to the projected position B can be calculated. The height H from the floor to the imaging unit 2 can be measured after installation, so it can be treated as a known value if it is measured in advance. Thus, after the top of the head A has been projected onto the floor, the distance D, the height H, and the height h measured at boarding and carried along during tracking are all known. The value to be obtained here is the true standing position, that is, the distance d from the point directly below the imaging unit 2 to the standing position. The distance d can be calculated by the proportional calculation of the following equation (3).

d = {D × (H - h)} / H … (3)

The position on the floor to which the top of the head is projected by perspective projection is the same as the projected position of the top of the head in the shadow that would be produced if a point light source placed at the position of the imaging unit 2 illuminated the person 20. Here, the case where the top of the head A of the person 20 of height h is projected from the imaging unit 2 has been taken as an example. As another example, as shown in Fig. 4(c), the shoulder height h' may be measured at boarding and the shoulder may be projected onto the floor surface. In this case, the distance D' from the point directly below the imaging unit 2 to the projected position B' of the shoulder is obtained, and from the distance D', the height H, and the height h', the distance d' from the point directly below the imaging unit 2 to the point directly below the shoulder is obtained by the same proportional calculation as before. Strictly speaking, this distance d' differs from the distance d by the offset between the point directly below the center of the body and the point directly below the shoulder, but it is almost equal to the distance d (Fig. 4(c)); it may therefore either be regarded as the standing position as it is, or be corrected for this difference to obtain the standing position.

The person detection at boarding and the pattern acquisition for pattern-matching tracking are performed by the passenger height measurement unit 3. The pattern tracking of the person is performed by the passenger tracking unit 4.

Although the above description is a view from the side (a cross-section through the floor), the same relationship also holds in a two-dimensional coordinate system looking straight down at the floor, as shown in Fig. 5. Fig. 5 is a diagram showing the relationship between the projected position of a person and the true standing position in the elevator monitoring device, in the orientation of looking straight down at the floor of the car. In Fig. 5, B denotes the projected position when the top of the head A of the person 20, indicated by the hatched area, is projected from the imaging unit 2 onto the floor surface. The grid area 20' indicates the position onto which the outline of the person 20 is projected. The origin is taken on the floor surface directly below the imaging unit 2, and the X and Y coordinate axes are set along the two walls. If the coordinates of the projected position B are (Dx, Dy) and the standing position of the person 20 is (dx, dy), they satisfy the following relationships, analogous to equation (3).

dx = {Dx × (H - h)} / H … (4)

dy = {Dy × (H - h)} / H … (5)

The height H is known from the beginning, the height h becomes known at boarding, and (Dx, Dy) becomes known through the perspective projection. Therefore, the standing position (dx, dy) of the person 20 can be calculated by equations (4) and (5).
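
A short sketch of this proportional correction, covering both the polar form of equation (3) and the XY form of equations (4) and (5); the numeric values in the example are assumed for illustration only.

```python
# Standing position from the projected position, eqs. (3)-(5).
# H: camera height above the floor, h: measured height of the tracked body part,
# (Dx, Dy): floor projection of that part, origin directly below the camera.

def standing_distance(D, H, h):
    return D * (H - h) / H          # eq. (3), polar form (distance only)

def standing_position(Dx, Dy, H, h):
    s = (H - h) / H                 # the same scale factor applies on both axes
    return Dx * s, Dy * s           # eqs. (4) and (5)

# Example with assumed values: camera at H = 2.4 m, tracked part at h = 1.65 m,
# projected position (Dx, Dy) = (3.2 m, 1.6 m).
print(standing_position(3.2, 1.6, 2.4, 1.65))   # -> (1.0, 0.5)
```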

Next, the processing in the projection of the passenger onto the floor (step s05) is explained using Fig. 6. Fig. 6 is a diagram showing a concrete example of the projective transformation. Fig. 6(a) shows the car looked down at from directly above, and Fig. 6(b) shows an image captured by the imaging unit 2. The question is how a desired position in the image of Fig. 6(b), for example the top of the head A of a person, is virtually projected onto the floor of Fig. 6(a) as seen from directly above; a projective transformation is used to answer it.

In Fig. 6(a), the origin O is defined on the floor directly below the imaging unit 2, the X axis is defined parallel to the wall facing the elevator door of the car, and the Y axis is defined parallel to the left wall of the car; the unit of length is the meter. In Fig. 6(b), on the other hand, the origin is taken at the upper left of the screen, the x axis points to the right along the horizontal direction of the screen, and the y axis points downward along the vertical direction; the unit of length is the number of pixels. The coordinates of Fig. 6(a) are written (X, Y)^T and the coordinates of Fig. 6(b) are written (x, y)^T, where the superscript T denotes the transpose. The transformation from (x, y)^T to (X, Y)^T is a combination of rotation, translation, and projective transformation, and it can be expressed by equation (6), or, by rearranging equation (6) via equation (6)', by equation (6)''; equation (6)'' is the expression in which K has been eliminated. Here, (x, y)^T is sometimes called the camera coordinates (coordinates in the image) and (X, Y)^T the ground coordinates (coordinates as seen from directly above).

$$
\begin{pmatrix} K \cdot x \\ K \cdot y \\ K \end{pmatrix}
=
\begin{pmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & 1 \end{pmatrix}
\begin{pmatrix} X \\ Y \\ 1 \end{pmatrix}
\qquad \cdots (6)
$$

$$
\begin{cases}
K \cdot x = C_{11} \cdot X + C_{12} \cdot Y + C_{13} \\
K \cdot y = C_{21} \cdot X + C_{22} \cdot Y + C_{23} \\
K = C_{31} \cdot X + C_{32} \cdot Y + 1
\end{cases}
\qquad \cdots (6)'
$$

$$
\begin{cases}
x = \dfrac{C_{11} \cdot X + C_{12} \cdot Y + C_{13}}{C_{31} \cdot X + C_{32} \cdot Y + 1} \\[2ex]
y = \dfrac{C_{21} \cdot X + C_{22} \cdot Y + C_{23}}{C_{31} \cdot X + C_{32} \cdot Y + 1}
\end{cases}
\qquad \cdots (6)''
$$

In equation (6) and the related equations, K is a variable related to the distance between the imaging unit 2 and the subject, and C11 to C32 are numerical values determined by the so-called camera parameters such as the height of the imaging unit 2 from the floor, its orientation, and its magnification; they can be obtained in advance. C11 to C32 can be obtained by so-called camera calibration, in which four or more known pairs of camera coordinates and ground coordinates are prepared. The four or more known pairs of coordinate values can be obtained, for example, by placing four or more marks that do not lie on a single straight line on the floor of the elevator, obtaining the ground coordinates of these marks by actual measurement with a tape measure or the like, and obtaining the camera coordinates from the coordinates of the mark positions in the image.
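
One common way to carry out such a calibration, sketched below with NumPy under the assumption that four or more point pairs are available, is to rewrite equation (6)'' as a linear system in the eight unknowns C11 to C32 and solve it by least squares; the patent only requires that the parameters be obtained, not this particular procedure.

```python
# Hedged sketch of the camera calibration described in the text: given >= 4 pairs of
# ground coordinates (X, Y) and image coordinates (x, y), rewrite eq. (6)'' as
#   C11*X + C12*Y + C13 - C31*X*x - C32*Y*x = x
#   C21*X + C22*Y + C23 - C31*X*y - C32*Y*y = y
# and solve for the eight parameters C11..C32 by least squares.
import numpy as np

def calibrate(ground_pts, image_pts):
    A, b = [], []
    for (X, Y), (x, y) in zip(ground_pts, image_pts):
        A.append([X, Y, 1, 0, 0, 0, -X * x, -Y * x]); b.append(x)
        A.append([0, 0, 0, X, Y, 1, -X * y, -Y * y]); b.append(y)
    c, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return c   # [C11, C12, C13, C21, C22, C23, C31, C32]
```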

Once C11 to C32 have become known through camera calibration, arbitrary coordinates (x, y)^T in the image of Fig. 6(b) can be converted into the coordinates (X, Y)^T as seen from directly above (Fig. 6(a)) by equation (7), obtained by rearranging equation (6)''.

$$
\begin{pmatrix} X \\ Y \end{pmatrix}
=
\begin{pmatrix} C_{11} - C_{31} \cdot x & C_{12} - C_{32} \cdot x \\ C_{21} - C_{31} \cdot y & C_{22} - C_{32} \cdot y \end{pmatrix}^{-1}
\begin{pmatrix} x - C_{13} \\ y - C_{23} \end{pmatrix}
\qquad \cdots (7)
$$

Therefore, in the processing of the projection of the passenger onto the floor (step s05), the passenger projection unit 5 calculates, based on the coordinates of the top of the head A in the image of Fig. 6(b), the coordinates, as seen from directly above, of the projected position B obtained when the top of the head A is projected onto the floor from the imaging unit 2 of Fig. 6(a).
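
Continuing the calibration sketch above, equation (7) reduces to a small 2x2 linear solve per image point; a hedged sketch of this image-to-floor conversion:

```python
# Image coordinates (x, y) -> ground coordinates (X, Y) via eq. (7),
# using the parameter vector c = [C11..C32] from the calibration sketch above.
import numpy as np

def image_to_ground(c, x, y):
    C11, C12, C13, C21, C22, C23, C31, C32 = c
    M = np.array([[C11 - C31 * x, C12 - C32 * x],
                  [C21 - C31 * y, C22 - C32 * y]])
    rhs = np.array([x - C13, y - C23])
    X, Y = np.linalg.solve(M, rhs)
    return X, Y   # projected position B, as seen from directly above
```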

Next, in the processing of the calculation of the passenger's standing position (step s06), the passenger standing position calculation unit 6 calculates, by equation (3) or equations (4) and (5), the coordinates of the standing position of the passenger as seen from directly above in Fig. 6(a) (in this case the coordinates of the top of the head A), based on the height H of the imaging unit 2, the height h of the passenger, and the positional relationship between the position of the imaging unit 2 and the projected position B (for example, the distance D, the coordinates (Dx, Dy), the distance Dx, the distance Dy, and so on). Equation (3) can be used when a polar coordinate system is used, and equations (4) and (5) can be used when an XY coordinate system is used.

The projection of the passenger onto the floor (step s05) and the calculation of the passenger's standing position (step s06) described above are performed for all passengers in the car.

Furthermore, as in the example of Fig. 4(c) that uses the shoulder, it suffices that the passenger height measurement unit 3 measures, as the height h, the height of a part A of the passenger (a specific part such as the top of the head or the shoulder), and that the passenger projection unit 5, taking as input the on-screen coordinates of the part A of the passenger corresponding to this height h, obtains the projected position B, as seen from directly above, obtained when the part A of the passenger is projected from the imaging unit 2 onto the floor. The part of the passenger used for projection is desirably as high a part as possible, such as the top of the head, which is unlikely to be hidden when the car is crowded. The passenger standing position calculation unit 6 can then calculate the coordinates of the standing position of the passenger as seen from directly above, based on the height H of the imaging unit 2, the height h of the part A of the passenger, and the positional relationship between the position of the imaging unit 2 and the projected position B. (In this case these are the coordinates of the part A of the passenger; when the part is known to be a specific part such as the shoulder, the coordinates may, for example, be corrected to the coordinates of the center position of the passenger using a value such as the distance from the center position of the passenger to the part A, and the corrected coordinates used as the coordinates of the standing position.)

Next, the estimation of the area occupied by the passengers and the determination of the degree of congestion in the processing flow of Fig. 2 (step s07) are explained. Fig. 7 is a diagram in which the elevator monitoring device estimates the occupancy state of the floor of the car. Fig. 7(a) is a conceptual diagram of the image that would be obtained if the car were looked down at from a considerably high position above its ceiling. The rectangular area represents the car area 30, and the stippled area is the area 31 in which the floor of the car can be seen. The hatched portions are the areas 32 in which passengers occupy the floor. It can be considered that the degree of congestion increases as the proportion of the area 31 decreases, and, if the total area of the area 31 is less than the amount for one person, the car can be regarded as congested to the point of having no remaining room and judged to be full. According to this embodiment, the images obtained by the imaging unit 2 installed in a corner of the car like an ordinary security camera can be processed to obtain an image like Fig. 7, or an image close to it, the occupancy ratio of persons, luggage, and the like with respect to the floor surface can be obtained, and the degree of congestion of the car can be determined.

Fig. 7(b) is an image in which the occupancy state of the floor of the car is estimated based on the standing positions of the passengers obtained in step s06. The circular grid-pattern area 33, centered on the standing position of the passenger (the coordinates of A) and having the passenger's width w as its diameter, is the area estimated to be occupied by the person. The area 31 is the area in which the floor can be seen, that is, the area with remaining room. The degree of congestion is determined based on the size of this area, and the car is judged full when the area 31 becomes smaller than the amount for one person. As the width w, the width measured at boarding may be used, or the measurement of the width w at boarding may be omitted and a preset given width may be used instead.

In reality, most of the area occupied by a passenger is not a circle but an ellipse whose major axis spans the two shoulders. For this reason, the ratio of the major axis to the minor axis when the occupied area of a person is approximated by an ellipse may be obtained statistically in advance, and an occupancy ellipse with the width w as its major axis may be placed at the standing position, as estimated in Figs. 7(c1) and 7(c2). In Fig. 7(c1), the orientation of the passenger's face is aligned with the minor-axis direction of the elliptical area 33; to realize this, it suffices, for example, to detect facial features during the passenger tracking of step s04 and thereby detect the orientation of the face. In Fig. 7(c2), on the other hand, the elliptical area 33 is placed regardless of the orientation of the passenger's face, so the face orientation need not be detected. Moreover, even if the assumed orientation of the passenger differs from the actual one, the area occupied on the floor surface of the car does not differ greatly, so for determining the degree of congestion or fullness of the car it is sufficient to place the elliptical area 33 regardless of the passenger's orientation.

As shown in Fig. 7, for example, the occupied area estimation unit 7 creates, based on the standing positions of the passengers obtained by the passenger standing position calculation unit 6, an estimated image representing the areas occupied by the passengers when the inside of the car is viewed from directly above, and estimates the area occupied by all passengers.

The congestion degree determination unit 8 determines the degree of congestion from the area occupied by all passengers estimated by the occupied area estimation unit 7. As examples of the degree of congestion, it may be expressed as the ratio of the area occupied by all passengers, estimated by the occupied area estimation unit 7, to the area of the car, which is known in advance, or this ratio or occupied area may be divided into several stages and the degree of congestion expressed in several levels such as large, medium, and small, or 1, 2, 3. If the case where the degree of congestion is at its maximum is regarded as full, the determination of fullness may be included in the determination of the degree of congestion; determining only whether or not the car is full is also included in the concept of determining the degree of congestion. As methods of determining fullness, there are, for example, a method in which the car is judged full when the remaining floor area (the area of the car minus the area occupied by the passengers) is smaller than a given threshold (for example, the amount for one person), and a method in which the car is judged full when the degree of congestion or the occupied area is equal to or greater than a given threshold indicating fullness.
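
As a hedged sketch of steps s07 and s08, the occupied area can be estimated by rasterizing one occupancy ellipse per passenger onto a top-view grid of the car floor (orientation ignored, as in Fig. 7(c2)) and comparing the remaining free area with a one-person threshold. The grid resolution, axis ratio, and one-person area below are illustrative assumptions, not values from the patent.

```python
# Sketch of steps s07/s08: rasterize one occupancy ellipse per passenger onto a
# top-view grid of the car floor and judge congestion from the remaining free area.
import numpy as np

def congestion(car_w, car_d, passengers, cell=0.05, axis_ratio=0.6, one_person=0.2):
    """passengers: list of (dx, dy, w) standing positions and shoulder widths in metres.
    Returns (occupancy_ratio, is_full)."""
    nx, ny = int(car_w / cell), int(car_d / cell)
    occupied = np.zeros((ny, nx), dtype=bool)
    ys, xs = np.mgrid[0:ny, 0:nx]
    cx, cy = (xs + 0.5) * cell, (ys + 0.5) * cell          # cell centres in metres
    for dx, dy, w in passengers:
        a, b = w / 2.0, (w * axis_ratio) / 2.0             # major / minor semi-axes
        occupied |= ((cx - dx) / a) ** 2 + ((cy - dy) / b) ** 2 <= 1.0
    occupied_area = occupied.sum() * cell * cell
    free_area = car_w * car_d - occupied_area
    return occupied_area / (car_w * car_d), free_area < one_person
```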

Next, in the processing of the congestion determination output (step s08), the output unit 10 outputs the determination result of the congestion degree determination unit 8 (the degree of congestion, whether the car is full, and so on).

An elevator control device (not shown) then controls the elevator based on this congestion determination output. For example, when the car is full, even if a hall call is registered on a floor where no stop is scheduled, control such as passing the floor without responding to the hall call is performed.

As described above, according to this embodiment, the degree of congestion of the car can be determined by processing the images of an imaging unit installed in the car so as to face obliquely downward.

The present invention is not limited to the embodiment described above and includes various modifications. For example, the above embodiment has been described in detail in order to explain the present invention clearly, and the invention is not necessarily limited to one having all of the configurations described.

Description of Reference Signs

1···arithmetic control unit

2···imaging unit

3···passenger height measurement unit

4···passenger tracking unit

5···passenger projection unit

6···passenger standing position calculation unit

7···occupied area estimation unit

8···congestion degree determination unit

9···operation control unit

10···output unit

20···person

Claims (6)

1. An elevator monitoring device comprising an imaging unit installed so as to face obliquely downward and photograph the inside of a car of an elevator, the elevator monitoring device comprising:
a passenger height measurement unit that measures the height of a part of a passenger with reference to a mark of known height in the car;
a passenger tracking unit that tracks the passenger in the car based on images captured by the imaging unit;
a passenger projection unit that takes as input the on-screen position of the part of the passenger in the image and calculates the projected position, as observed from directly above, obtained when the part of the passenger is projected from the imaging unit onto the floor of the car;
a passenger standing position calculation unit that calculates the standing position of the passenger as seen from directly above, based on the height of the imaging unit, the height of the part of the passenger, and the positional relationship between the position of the imaging unit and the projected position;
an occupied area estimation unit that estimates, based on the standing position of the passenger, the area occupied by all passengers when the inside of the car is viewed from directly above; and
a congestion degree determination unit that determines the degree of congestion of the car based on the occupied area.
2. The elevator monitoring device according to claim 1, wherein
the passenger height measurement unit also measures the width of the passenger, and
the occupied area estimation unit also uses the width of the passenger to estimate the area occupied by all the passengers.
3. The elevator monitoring device according to claim 1 or 2, wherein
the congestion degree determination unit determines whether the inside of the car is full.
4. An elevator monitoring method that performs monitoring using images captured by an imaging unit installed so as to face obliquely downward and photograph the inside of a car of an elevator, the elevator monitoring method comprising:
measuring the height of a part of a passenger with reference to a mark of known height in the car; tracking the passenger in the car based on the images captured by the imaging unit; taking as input the on-screen position of the part of the passenger in the image, calculating the projected position, as observed from directly above, obtained when the part of the passenger is projected from the imaging unit onto the floor of the car; calculating the standing position of the passenger as seen from directly above, based on the height of the imaging unit, the height of the part of the passenger, and the positional relationship between the position of the imaging unit and the projected position; estimating, based on the standing position of the passenger, the area occupied by all passengers when the inside of the car is viewed from directly above; and determining the degree of congestion of the car based on the occupied area.
5. The elevator monitoring method according to claim 4, wherein
the width of the passenger is also measured, and the width of the passenger is also used to estimate the area occupied by all the passengers.
6. The elevator monitoring method according to claim 4 or 5, wherein
the determination of the degree of congestion includes determining whether the inside of the car is full.
CN201410017454.XA 2013-01-28 2014-01-15 Elevator monitoring arrangement and elevator monitoring method Active CN103964271B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013012774A JP6033695B2 (en) 2013-01-28 2013-01-28 Elevator monitoring device and elevator monitoring method
JP2013-012774 2013-01-28

Publications (2)

Publication Number Publication Date
CN103964271A CN103964271A (en) 2014-08-06
CN103964271B true CN103964271B (en) 2016-04-20

Family

ID=51234353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410017454.XA Active CN103964271B (en) 2013-01-28 2014-01-15 Elevator monitoring arrangement and elevator monitoring method

Country Status (2)

Country Link
JP (1) JP6033695B2 (en)
CN (1) CN103964271B (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106032232B (en) * 2015-03-11 2018-10-19 上海三菱电梯有限公司 The identification device of lift car place holder and recognition methods
JP6474687B2 (en) * 2015-05-27 2019-02-27 株式会社日立製作所 Elevator with image recognition function
CN105184828A (en) * 2015-08-14 2015-12-23 中山大学 Machine vision based method and apparatus for monitoring space utilization rate of cage
JP6503313B2 (en) * 2016-03-28 2019-04-17 株式会社日立製作所 Group management control device and group management control system
JP6593822B2 (en) * 2016-06-08 2019-10-23 三菱電機株式会社 Monitoring device
EP3281904B1 (en) 2016-08-09 2020-03-25 Otis Elevator Company Control systems and methods for elevators
CN106348112A (en) * 2016-08-25 2017-01-25 桂林九马新动力科技有限公司 Elevator control method and system and elevator
CN106276435A (en) * 2016-08-25 2017-01-04 桂林九马新动力科技有限公司 A kind of elevator control method based on crowding, system and elevator
WO2018163243A1 (en) * 2017-03-06 2018-09-13 三菱電機株式会社 Object tracking device and object tracking method
CN107145819A (en) * 2017-03-14 2017-09-08 浙江宇视科技有限公司 A kind of bus crowding determines method and apparatus
EP3406556A1 (en) 2017-05-23 2018-11-28 Otis Elevator Company Elevator doorway display systems for elevator cars
WO2018220782A1 (en) * 2017-06-01 2018-12-06 三菱電機株式会社 Elevator device
JP6479936B1 (en) * 2017-11-09 2019-03-06 東芝エレベータ株式会社 Elevator control system
WO2019176039A1 (en) * 2018-03-15 2019-09-19 三菱電機株式会社 Image processing device, image processing method, and image processing program
EP3543189B1 (en) 2018-03-19 2022-07-27 Otis Elevator Company Elevator car operation based on its occupancy
US11724907B2 (en) 2018-06-14 2023-08-15 Otis Elevator Company Elevator floor bypass
WO2020031231A1 (en) * 2018-08-06 2020-02-13 三菱電機株式会社 Operation management device and operation management program
JP6918897B2 (en) * 2019-10-18 2021-08-11 東芝エレベータ株式会社 Elevator fullness detection system
CN110759196A (en) * 2019-11-15 2020-02-07 深圳技术大学 Elevator volume display method, storage medium and terminal equipment
JP6997819B2 (en) * 2020-02-05 2022-01-18 東芝エレベータ株式会社 Elevator system
JP7461277B2 (en) 2020-10-30 2024-04-03 株式会社日立製作所 Elevator group management control device and platform congestion avoidance control method.
CN113158826A (en) * 2021-03-30 2021-07-23 开易(北京)科技有限公司 Passenger car overload detection method and device and electronic equipment
KR102541959B1 (en) * 2021-04-08 2023-06-12 네이버랩스 주식회사 Elevator control system and method for controlling elevator which robot and human board
CN113420693B (en) * 2021-06-30 2022-04-15 成都新潮传媒集团有限公司 Door state detection method and device, and car passenger flow statistical method and equipment
CN115246610B (en) * 2021-09-10 2023-11-17 菱王电梯有限公司 Elevator car inclination detection method and system and elevator

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1759613A (en) * 2003-03-20 2006-04-12 因温特奥股份公司 Monitoring a lift area by means of a 3d sensor

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2816310B2 (en) * 1994-07-08 1998-10-27 株式会社日立製作所 Object detection device in elevator car
JP3243234B2 (en) * 1999-07-23 2002-01-07 松下電器産業株式会社 Congestion degree measuring method, measuring device, and system using the same
JP4239621B2 (en) * 2003-03-11 2009-03-18 株式会社明電舎 Congestion survey device
JP2006240825A (en) * 2005-03-03 2006-09-14 Hitachi Building Systems Co Ltd Elevator operation control device
JP2007031105A (en) * 2005-07-28 2007-02-08 Hitachi Building Systems Co Ltd Elevator passenger abnormality detection device
JP4793324B2 (en) * 2007-05-30 2011-10-12 株式会社日立製作所 Vehicle monitoring apparatus and vehicle monitoring method
JP2009143722A (en) * 2007-12-18 2009-07-02 Mitsubishi Electric Corp Person tracking apparatus, person tracking method and person tracking program
JP4663756B2 (en) * 2008-04-28 2011-04-06 株式会社日立製作所 Abnormal behavior detection device
CN102334142A (en) * 2009-02-24 2012-01-25 三菱电机株式会社 Human tracking device and human tracking program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1759613A (en) * 2003-03-20 2006-04-12 因温特奥股份公司 Monitoring a lift area by means of a 3d sensor

Also Published As

Publication number Publication date
CN103964271A (en) 2014-08-06
JP6033695B2 (en) 2016-11-30
JP2014144826A (en) 2014-08-14

Similar Documents

Publication Publication Date Title
CN103964271B (en) Elevator monitoring arrangement and elevator monitoring method
US10755124B2 (en) Passenger counting device, system, method and program, and vehicle movement amount calculation device, method and program
JP6458734B2 (en) Passenger number measuring device, passenger number measuring method, and passenger number measuring program
US9358976B2 (en) Method for operating a driver assistance system of a vehicle
JP6914699B2 (en) Information processing equipment, information processing methods and programs
EP2858037B1 (en) Moving object detector
JP2009143722A (en) Person tracking apparatus, person tracking method and person tracking program
JP4848312B2 (en) Height estimating apparatus and height estimating method
JP2012059030A (en) Human body identification method and human body identification apparatus using range image camera
US20170259830A1 (en) Moving amount derivation apparatus
EP2924612A1 (en) Object detection device, object detection method, and computer readable storage medium comprising object detection program
US20150310296A1 (en) Foreground region extraction device
JP2019530924A5 (en)
JP6139729B1 (en) Image processing device
WO2019016971A1 (en) Number-of-occupants detection system, number-of-occupants detection method, and program
EP3082069A1 (en) Stereoscopic object detection device and stereoscopic object detection method
EP2989611A1 (en) Moving object detection
JP2011209070A (en) Image processor
US20130142388A1 (en) Arrival time estimation device, arrival time estimation method, arrival time estimation program, and information providing apparatus
KR101721655B1 (en) System and method for object detection
TWI755960B (en) Object counting method and monitoring camera
JP4471866B2 (en) Person detection method
KR20150042417A (en) Lane Detection Method and System Using Photography Part
JP6587253B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant