EP1838605A1 - Elevator - Google Patents

Elevator

Info

Publication number
EP1838605A1
Authority
EP
European Patent Office
Prior art keywords
image
car
information
building
display image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05704139A
Other languages
English (en)
French (fr)
Other versions
EP1838605A4 (de)
Inventor
Aernoud Brouwers (Mitsubishi Elevator Europe BV)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of EP1838605A1
Publication of EP1838605A4

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B3/00: Applications of devices for indicating or signalling operating conditions of elevators
    • B66B3/002: Indicators
    • B66B3/008: Displaying information not related to the elevator, e.g. weather, publicity, internet or TV
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00: Control systems of elevators in general
    • B66B1/34: Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/3492: Position or motion detectors or driving means for the detector
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66B: ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B3/00: Applications of devices for indicating or signalling operating conditions of elevators

Definitions

  • This invention relates to an elevator in which a car and a counterweight ascend and descend in an elevator shaft.
  • Japanese Patent No. 3484731 describes two solutions for creating such an external image, i.e. a view of the building's surroundings shown inside the car.
  • One solution described is to have, per elevator car, one camera on the facade that moves synchronously with the elevator. This means that on high-rise buildings the camera has to travel along the entire facade of the building at the same (high) speed as the elevator. It is basically a separate elevator system for the camera.
  • The second solution describes a camera that is fixed to the facade of the building at a height equal to half the travel height of the elevator car. To provide the image for the complete travel of the elevator car, the camera changes its viewing angle in the vertical direction.
  • In both cases, each elevator needs a separate camera system. If the cameras move along the facade of the building, extensive guide systems need to be provided, which may overlap windows, etc.
  • Moving the camera by means of a guide system also means the guide system must be perfectly straight to avoid vibration of the camera, because when the camera vibrates the image in the car will vibrate in the same way, and the vibration will be magnified by the camera lens.
  • The result can be that passengers become nauseous, especially when the motion of the car does not correspond to the motion on the screen.
  • This invention has been made to solve the above-mentioned problems, and an object thereof is to provide an elevator in which an image corresponding to a car position can be shown inside the car, and which can cut down on costs and allow easy maintenance and inspection.
  • Fig. 1 is a block diagram illustrating an elevator according to an embodiment of this invention.
  • Fig. 2 is a conceptual diagram illustrating images of individual cameras before processing takes place in a processing unit shown in Fig. 1, resulting in the corrected image.
  • Figs. 1 and 2 show the principle of this system.
  • Fig. 1 is a block diagram illustrating an elevator according to an embodiment of this invention.
  • a vertically extending elevator shaft 2 is provided inside a building 1.
  • a hoisting machine (not shown) serving as a drive device is installed inside the elevator shaft 2.
  • a main rope 5 is wound around a drive sheave of the hoisting machine.
  • a car 4 and a counterweight (not shown) are suspended inside the elevator shaft 2 by the main rope 5.
  • the car 4 is raised and lowered inside the elevator shaft 2 by the drive force of the hoisting machine.
  • a screen 3 is mounted inside the car 4.
  • the screen 3 is used as a virtual window to show the surroundings of the building.
  • the screen 3 is ultra-thin and is transmissive, reflective, or emissive.
  • Attached to a facade 10 of the building 1 are a plurality of cameras (imaging devices) 6, each viewing a part of the surroundings of the building 1.
  • the cameras 6 are fixed to the building 1 in a stationary, non-pivoting manner.
  • the cameras 6 are so spaced from each other that their respective imaging areas 7 are shifted from each other.
  • the cameras 6 are provided over the entire height of the building and equally spaced.
  • Each imaging area 7 partially overlaps a part of its adjacent imaging area 7. Further, each camera 6 converts the information that it obtains as it partly views the surroundings of the building 1 into electrical image information for output.
  • the surroundings of the building 1 are electronically captured by a plurality of cameras 6 that are attached to the building's facade 10.
  • Each camera 6 views an imaging area 7 of the surroundings of the building 1.
  • Each imaging area 7 partly overlaps the imaging area 7 of the adjacent camera 6 to ensure that the entire surroundings are captured.
  • Ideally, the imaging area 7 of each camera 6 corresponds with the edges of the adjacent imaging area 7. To achieve this, however, a lot of expensive physical adjustment of each camera 6 is required. To keep the procedure simple, the cameras 6 are mounted less accurately to the facade 10, but with an overlap of each imaging area 7. The adjustment is later performed electronically by means of a processing unit 9.
  • a processing unit (image processing device) 9 for processing image information from the individual cameras 6 is electrically connected to the individual cameras 6.
  • the processing unit 9 is electrically connected to the individual cameras 6 via an information communication network (network) 16 consisting of wired cables (wires).
  • a car position detecting device (not shown) for detecting the position of the car 4 is electrically connected to the processing unit 9.
  • the processing unit 9 calculates the position of the car 4 based on position information from the car position detecting device.
  • the actual position of the car 4 is constantly sent to the processing unit 9.
  • the position information is used to decide which part of the image (the display image portion 8) is shown inside the car 4.
  • position information indicative of the position of the car 4 is constantly sent from the car position detecting device to the processing unit 9.
  • the processing unit 9 is electrically connected to the projector.
  • the processing unit 9 is electrically connected to the projector via a link 17 consisting of a wired cable (wire).
  • the projector is adapted to show an image on the screen 3 based on information from the processing unit 9.
  • the processing unit 9 selects a part of the image information as a display image portion 8 and then performs processing to show the display image portion 8 on the screen 3. Specifically, the processing unit 9 selects as the display image portion 8 a portion of the individual image information which corresponds to the position of the car 4 (a portion within a fixed area at the same height as the position of the car 4), performs corrections on the display image portion 8, and then sends information of the corrected display image portion 8 to the projector (an illustrative sketch of this position-based selection follows this list).
  • the processing unit 9 can first collect all images from the cameras 6 via the network 16 and create one image out of them before selecting the display image portion 8 that is to be shown on the screen 3 via the link 17.
  • the projector shows an image on the screen 3 based on the information of the corrected display image portion 8. As a result, the view of the surroundings of the building 1 as seen from the position of the car 4 is shown on the screen 3.
  • the motion on the screen 3 in the car 4 must be perfectly synchronized with the car motion; otherwise passengers inside the car 4 might experience a conflict between the images they see and the car motion they feel, resulting in nausea.
  • the images of the cameras 6 should slightly overlap.
  • for each camera 6, a correction vector is determined to shift and rotate its image such that it corresponds with the images of the other cameras 6. A correction is required because physically aligning each camera 6 is difficult and expensive (an illustrative sketch of estimating such a correction follows this list).
  • the processing unit 9 grasps the actual position of the car 4 based on position information from the car position detecting device, and selects the images to show on the screen 3 from among image information from the individual cameras 6 based on the grasped position of the car 4.
  • the processing unit 9 performs processing for combining the selected image information. Further, the processing unit 9 can repeat this processing. As a result, the image shown on the screen 3 is continuously updated as the car 4 moves.
  • the images of one or more cameras 6 are selected and, if necessary, combined to create the actual view corresponding to the car position. This view is made visible on the screen 3 in the car 4.
  • Figure 2 shows the images 11-14 of the individual cameras 6 before processing takes place in the processing unit 9, resulting in the corrected image 15.
  • the positions and angles of the images 11 to 14 obtained from the individual cameras 6 differ from each other due to, for example, errors in mounting the individual cameras 6 or the like.
  • the processing unit 9 adjusts the respective positions and angles of the images 11 to 14 such that the images 11 to 14 partly overlap each other. That is, the processing unit 9 selectively rotates and shifts each of the images 11 to 14 to align them (that is, to correct the images 11 to 14), creating one total image 15 from the corrected images 11 to 14. Further, the processing unit 9 selects a part of the total image 15 as the display image portion 8 based on position information from the car position detecting device, and sends information of the display image portion 8 to the projector.
  • the processing unit 9 shifts and/or rotates each of the images 11-14 to be able to create one total image 15 (an illustrative sketch of this correction and stitching step follows this list).
  • each camera 6 views a part of the surroundings of the building 1 in each of multiple imaging areas 7 and outputs image information corresponding to each imaging area 7;
  • the car position detecting device detects the position of the car 4 to output position information;
  • the processing unit 9 selects the display image portion 8 based on the image information and the position information and performs processing for showing the display image portion 8 on the screen 3 inside the car 4, whereby the continuously changing image of the surroundings of the building 1 can be shown on the screen 3 inside the car 4 without displacing the individual cameras 6 relative to the building 1.
  • This configuration not only reduces the trouble associated with mounting the individual cameras 6 on the building 1 but also makes it possible to use inexpensive, commonly mass-produced cameras such as CCD or CMOS sensors as the cameras 6, enabling a reduction in cost. Further, the simplified mounting structure of the cameras 6 facilitates easy maintenance and inspection. Furthermore, fewer aesthetic problems arise with respect to the building facade, and the individual cameras 6 can be mounted on the building 1 without spoiling the exterior appearance.
  • each imaging area 7 partially overlaps a part of its adjacent imaging area 7, which ensures that there will be no area that is not viewed by the cameras 6 due to an error in mounting the cameras 6, allowing continuous viewing of the surroundings of the building 1 with greater reliability.
  • image information from each camera 6 is electrically processable information, making it possible to process the image information with ease and at greater speed.
  • the processing unit 9 acquires image information from all the individual cameras 6, whereby it is not necessary for the individual cameras 6 to store the image information and the cameras 6 can be further simplified in structure.
  • the screen 3 for showing the display image portion 8 is provided inside the car 4, ensuring increased sharpness of the image shown.
  • In the above configuration, the processing unit 9 acquires image information from all the individual cameras 6 before selecting the display image portion 8.
  • Alternatively, the processing unit 9 may acquire from the cameras 6 only the display image portion 8 selected based on position information from the car position detecting device.
  • In that case, the processing unit 9 calculates which of the cameras 6 are nearest to the car position, retrieves only those images via the network 16, and processes only those images to create the display image portion 8 that is to be shown on the screen 3 via the link 17.
  • each camera 6 is provided with a storage portion for storing a compressed form of the image information taken in its imaging area 7.
  • the portion of the image information selected by a request from the processing unit 9 is sent to the processing unit 9 as the display image portion 8. That is, several sets of image processing are distributed between the individual cameras 6 and the processing unit 9 for execution. This means that each camera 6 performs pre-processing in advance to compress the image, and sends the compressed image to the processing unit 9 upon request (an illustrative sketch of this camera-side compression and retrieval follows this list).
  • the amount of information to be processed by the processing unit 9 can thereby be reduced, making it possible to increase the throughput of the processing unit 9.
  • the processing unit 9 includes a storage portion (memory) for storing the image data to be superimposed and performs processing for superimposing the image of the image data to be superimposed on the display image portion 8 and showing the resulting image on the screen 3 inside the car 4.
  • the image of the image data to be superimposed is handled as an additional image different from that of the surroundings of the building 1.
  • the view could be virtually restricted by a wall that is added (superimposed) on the image (an illustrative sketch of such superimposing follows this list).
  • the processing unit 9 effects changes to the image shown inside the car 4 based on information indicative of the manipulations performed with the manipulation device for change.
  • the passengers can manipulate the system such that they can shift the image or zoom in on a location that they are interested in through manipulations with the manipulation device for change (an illustrative sketch of such a pan/zoom manipulation follows this list).
  • the screen 3 can be used not only for showing the surroundings of the building 1, but also to show the current floor, advertisements, outside weather information such as temperature and humidity, etc.
  • In the above example, the image of the surroundings of the building 1 is shown inside the car 4 with respect to only one car 4 that is raised and lowered in the elevator shaft 2.
  • However, the image of the surroundings of the building 1 may be shown inside the car with respect to each of several individual cars.
  • In that case, multiple car position detecting devices, which independently detect the positions of the individual cars, are provided in the elevator shaft of each elevator. Further, the processing unit 9 selects multiple display image portions 8 corresponding to the individual cars based on position information from the individual car position detecting devices, and sends the corresponding display image portion 8 to each car. That is, the image corresponding to the individual car position is shown inside each car.
  • the image corresponding to each individual car can be shown inside each car through processing by a common processing unit 9, thus further facilitating image processing.
  • the processing unit 9 can acquire image information from the individual cameras 6, which are common to all the cars it serves, whereby one camera system can service a group of multiple elevators, making it unnecessary to mount multiple cameras on the building 1 in association with individual elevators. Therefore, the number of cameras can be reduced, leading to a reduction in cost.
  • Because the processing unit 9 is able to reproduce the entire surroundings, it is possible to take multiple display image portions 8 to serve not one elevator but multiple elevators, each car showing the surroundings corresponding to its individual location (an illustrative sketch of serving a group of cars from one processing unit follows this list).
  • In the above example, the network 16 and the link 17 consist of wires.
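
The correction vector mentioned above has a translational part that could, for example, be estimated from the overlapping strips of two adjacent camera images. The following is a minimal sketch of such an estimate using phase correlation; the function name estimate_shift, and the assumption that the overlap strips are already cut out as same-sized grayscale arrays, are illustrative and not taken from the patent.

```python
# Illustrative sketch only: estimating the translational part of a correction
# vector from the overlapping strips of two adjacent camera images by phase
# correlation. Rotation estimation and the choice of overlap size are omitted.
import numpy as np

def estimate_shift(overlap_a, overlap_b):
    """Return (dy, dx) to apply to overlap_b so that it aligns with overlap_a."""
    fa = np.fft.fft2(overlap_a)
    fb = np.fft.fft2(overlap_b)
    cross_power = fa * np.conj(fb)
    cross_power /= np.abs(cross_power) + 1e-12   # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # The FFT is circular, so wrap large indices to negative shifts.
    dy = peak[0] if peak[0] <= overlap_a.shape[0] // 2 else peak[0] - overlap_a.shape[0]
    dx = peak[1] if peak[1] <= overlap_a.shape[1] // 2 else peak[1] - overlap_a.shape[1]
    return dy, dx
```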
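
Once correction values are known, the electronic adjustment performed by the processing unit 9 could be applied roughly as in the following sketch, which shifts and rotates each camera image and pastes the results into one total image (image 15 in Fig. 2). The Correction dataclass, stitch_total_image, mount_rows and the simple maximum blend are assumptions for illustration, not the patent's implementation.

```python
# Illustrative sketch only: correct each camera image with a pre-calibrated
# shift/rotation and combine the results into one total image.
from dataclasses import dataclass
import numpy as np
from scipy.ndimage import rotate, shift

@dataclass
class Correction:
    dx: float     # horizontal shift in pixels, from calibration
    dy: float     # vertical shift in pixels, from calibration
    angle: float  # rotation in degrees, from calibration

def stitch_total_image(images, corrections, mount_rows, total_shape):
    """Align each (grayscale) camera image and paste it at its nominal mounting row."""
    total = np.zeros(total_shape, dtype=np.float32)
    for img, corr, row in zip(images, corrections, mount_rows):
        aligned = rotate(img, corr.angle, reshape=False, order=1)
        aligned = shift(aligned, (corr.dy, corr.dx), order=1)
        h, w = aligned.shape
        # Overlapping regions are blended by taking the brighter pixel; a real
        # system could feather the seams instead. total_shape is assumed large
        # enough to hold every pasted image.
        total[row:row + h, :w] = np.maximum(total[row:row + h, :w], aligned)
    return total
```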
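
Selecting the display image portion 8 from the stitched total image at the height reported by the car position detecting device could then look like the following sketch; the linear mapping from car height to image row and the names select_display_portion, travel_height_m and window_px are illustrative assumptions.

```python
# Illustrative sketch only: cut a fixed-height window out of the total image at
# the row corresponding to the current car position.
import numpy as np

def select_display_portion(total_image, car_height_m, travel_height_m, window_px):
    """Return the part of the total image that lies at the car's current height."""
    rows = total_image.shape[0]
    # Row 0 is assumed to be the top of the imaged facade.
    centre_row = int(round((1.0 - car_height_m / travel_height_m) * (rows - 1)))
    top = max(0, min(rows - window_px, centre_row - window_px // 2))
    return total_image[top:top + window_px, :]

# Example: with a 10 000-row total image covering 100 m of travel, a car at 42 m
# yields a 480-row window centred near row 5800.
portion = select_display_portion(np.zeros((10000, 640)), 42.0, 100.0, 480)
```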
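
For the variant in which each camera 6 stores a compressed frame and the processing unit 9 retrieves, via the network 16, only the frames of the cameras nearest to the car, a rough sketch might look as follows; the Camera class, its get_compressed_frame() interface and the use of zlib are assumptions, not part of the patent.

```python
# Illustrative sketch only: camera-side compression plus retrieval of the frames
# of the cameras closest to the car position.
import zlib
import numpy as np

class Camera:
    def __init__(self, mount_height_m, frame):
        self.mount_height_m = mount_height_m
        # Camera-side pre-processing: keep only a compressed copy of the frame.
        self._compressed = zlib.compress(frame.tobytes())
        self._shape, self._dtype = frame.shape, frame.dtype

    def get_compressed_frame(self):
        return self._compressed, self._shape, self._dtype

def fetch_nearest_frames(cameras, car_height_m, count=2):
    """Decompress only the frames of the `count` cameras closest to the car."""
    nearest = sorted(cameras, key=lambda c: abs(c.mount_height_m - car_height_m))[:count]
    return [
        np.frombuffer(zlib.decompress(data), dtype=dtype).reshape(shape)
        for data, shape, dtype in (cam.get_compressed_frame() for cam in nearest)
    ]
```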
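
Superimposing stored image data, such as a virtual wall that restricts the view, on the display image portion 8 could be done with simple alpha blending as sketched below; this particular blend is an assumption, not the patented processing.

```python
# Illustrative sketch only: blend a stored overlay image onto the display portion
# before it is sent to the projector.
import numpy as np

def superimpose(display_portion, overlay, alpha):
    """Blend `overlay` onto `display_portion`; alpha=1 shows only the overlay."""
    alpha = np.clip(alpha, 0.0, 1.0)
    return (1.0 - alpha) * display_portion + alpha * overlay
```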
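
A passenger manipulation that shifts the image or zooms in on a location of interest could be realized by cropping around a chosen centre, as in the following sketch; the projector is then assumed to scale the crop back to the full screen 3, and the function pan_and_zoom is hypothetical.

```python
# Illustrative sketch only: pan/zoom by cropping a numpy image array around a
# chosen centre; a zoom factor of 2 returns a crop half as large per axis.
def pan_and_zoom(image, centre_row, centre_col, zoom):
    """Return a crop around (centre_row, centre_col), `zoom` times smaller per axis."""
    rows, cols = image.shape[:2]
    half_r, half_c = max(1, int(rows / (2 * zoom))), max(1, int(cols / (2 * zoom)))
    r0 = max(0, min(rows - 2 * half_r, centre_row - half_r))
    c0 = max(0, min(cols - 2 * half_c, centre_col - half_c))
    return image[r0:r0 + 2 * half_r, c0:c0 + 2 * half_c]
```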
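
One processing unit 9 and one camera system serving a group of elevators could compute a separate display image portion 8 for each car, as sketched below; this reuses the hypothetical select_display_portion helper from the earlier sketch, and the car identifiers and heights are illustrative.

```python
# Illustrative sketch only: one display image portion per car, computed from the
# same total image by a common processing unit.
def portions_for_group(total_image, car_heights_m, travel_height_m, window_px):
    """Return one display image portion per car, keyed by car identifier."""
    return {
        car_id: select_display_portion(total_image, height, travel_height_m, window_px)
        for car_id, height in car_heights_m.items()
    }

# Example call (identifiers and heights are made up):
# portions = portions_for_group(total_image, {"car_A": 12.0, "car_B": 55.5}, 100.0, 480)
```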

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)
  • Cage And Drive Apparatuses For Elevators (AREA)
EP05704139A 2005-01-20 2005-01-20 Elevator Withdrawn EP1838605A4 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2005/001029 WO2006077654A1 (en) 2005-01-20 2005-01-20 Elevator

Publications (2)

Publication Number Publication Date
EP1838605A1 true EP1838605A1 (de) 2007-10-03
EP1838605A4 EP1838605A4 (de) 2012-06-27

Family

ID=36692052

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05704139A Withdrawn EP1838605A4 (de) 2005-01-20 2005-01-20 Aufzug

Country Status (3)

Country Link
EP (1) EP1838605A4 (de)
CN (1) CN1942384A (de)
WO (1) WO2006077654A1 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010004547A1 (en) * 2008-06-17 2010-01-14 Digigage Ltd. System for altering virtual views
DE102013016921A1 (de) * 2013-10-11 2015-04-16 Oliver Bunsen Image display system and method for motion-synchronized image display in a means of transport
CN104787633B (zh) * 2015-04-17 2017-04-12 管存忠 Single-unit real-time synchronized-camera sightseeing elevator
EP3317216B1 (de) * 2015-07-03 2020-11-04 Otis Elevator Company Elevator car wall imaging system and method
US10961082B2 (en) 2018-01-02 2021-03-30 Otis Elevator Company Elevator inspection using automated sequencing of camera presets
US10941018B2 (en) 2018-01-04 2021-03-09 Otis Elevator Company Elevator auto-positioning for validating maintenance

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5485897A (en) * 1992-11-24 1996-01-23 Sanyo Electric Co., Ltd. Elevator display system using composite images to display car position
JPH1179580A (ja) * 1997-09-03 1999-03-23 Mitsubishi Denki Bill Techno Service Kk External video display device for an elevator

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63139277U (de) * 1987-03-06 1988-09-13
JP2733002B2 (ja) * 1992-11-24 1998-03-30 Sanyo Electric Co., Ltd. Elevator system
JP3484731B2 (ja) * 1993-09-17 2004-01-06 Mitsubishi Electric Corp Video information system for elevators
JPH09194167A (ja) * 1996-01-19 1997-07-29 Sanyo Electric Co Ltd Elevator system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5485897A (en) * 1992-11-24 1996-01-23 Sanyo Electric Co., Ltd. Elevator display system using composite images to display car position
JPH1179580A (ja) * 1997-09-03 1999-03-23 Mitsubishi Denki Bill Techno Service Kk External video display device for an elevator

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2006077654A1 *

Also Published As

Publication number Publication date
WO2006077654A1 (en) 2006-07-27
EP1838605A4 (de) 2012-06-27
CN1942384A (zh) 2007-04-04

Similar Documents

Publication Publication Date Title
WO2006077654A1 (en) Elevator
CN106559651B (zh) Virtual window for an aircraft
US9628772B2 (en) Method and video communication device for transmitting video to a remote user
EP2818948B1 (de) Method and data presentation device for assisting a remote user in providing instructions
JP5026067B2 (ja) Elevator remote inspection system
US9571798B2 (en) Device for displaying the situation outside a building with a lift
EP1968321B1 (de) Device and system for monitoring intruding objects
KR101981850B1 (ko) Hollow cylindrical rotating full-color LED display device
WO2004106857A1 (ja) Stereo optical module and stereo camera
WO2004106858A1 (ja) Stereo camera system and stereo optical module
CN101557975A (zh) Platform screen door
JP2012103921A (ja) Passing-vehicle monitoring system and vehicle monitoring camera
WO2018180310A1 (ja) Monitoring system and monitoring method
JP4475164B2 (ja) Monitoring system and monitoring method
JP2012011989A (ja) Surveillance camera system for station platforms
JP5634222B2 (ja) Passing-vehicle monitoring system and vehicle monitoring camera
JP2023024827A (ja) Elevator
CN111355904A (zh) Panoramic information acquisition system and display method for the interior of a mine
JP6955584B2 (ja) Door image display system and monitor
JP5506656B2 (ja) Image processing device
KR100847182B1 (ko) Elevator
JPWO2020039897A1 (ja) Station monitoring system and station monitoring method
JP5210251B2 (ja) Scenery image display device for an elevator
JP6230223B2 (ja) Display direction control system and display position control system
CN112189339B (zh) Image monitoring system for an escalator

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060921

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE

DAX Request for extension of the european patent (deleted)
RBV Designated contracting states (corrected)

Designated state(s): DE

A4 Supplementary search report drawn up and despatched

Effective date: 20120530

RIC1 Information provided on ipc code assigned before grant

Ipc: B66B 1/34 20060101ALI20120523BHEP

Ipc: B66B 3/00 20060101AFI20120523BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20120829