EP3575257A1 - Control of elevator with gaze tracking - Google Patents
- Publication number
- EP3575257A1 (application EP18175194.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- control panel
- camera
- eye
- control
- person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66B—ELEVATORS; ESCALATORS OR MOVING WALKWAYS
- B66B1/00—Control systems of elevators in general
- B66B1/34—Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
- B66B1/46—Adaptations of switches or switchgear
- B66B1/468—Call registering systems
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66B—ELEVATORS; ESCALATORS OR MOVING WALKWAYS
- B66B2201/00—Aspects of control systems of elevators
- B66B2201/40—Details of the change of control mode
- B66B2201/46—Switches or switchgear
- B66B2201/4607—Call registering systems
- B66B2201/4638—Wherein the call is registered without making physical contact with the elevator system
Definitions
- JP 2010 100 370 describes that a line of sight of a passenger of an elevator car is determined and that the elevator car is stopped at the next floor when the passenger looks at a specific location. This may be used as an alternative to a security button.
- WO 2011 114 489 A1 relates to a guide device for an elevator.
- the guide device comprises a camera, which takes pictures around the entrance of the elevator. Based on the pictures, it is detected whether or not a person has entered the elevator.
- WO 2005 56251 A1 describes an elevator system with a camera, which detects the face of a person and determines therefrom whether the person uses a wheel-chair.
- the method further comprises: selecting a control command by determining whether the gaze point is on the control command.
- the selected floor then may be determined with the selected control command.
- the control command may be a number representing a floor. When the gaze point stays on this number, the control system may decide that the respective floor has been selected.
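The dwell-based decision described above ("the gaze point stays on this number") can be illustrated with a short sketch. The following Python fragment is purely illustrative and not part of the patent text; the function name and the 1.0 second dwell threshold are assumptions.

```python
def dwell_select(samples, dwell_time=1.0):
    """Decide a selection from (timestamp, target) gaze samples.

    The first target looked at continuously for at least `dwell_time`
    seconds is returned as the selection; looking away resets the
    timer. `dwell_time` is an assumed tuning value, not taken from
    the patent.
    """
    current = None   # target currently being looked at
    start_t = None   # time at which the gaze settled on it
    for t, target in samples:
        if target != current:
            current, start_t = target, t   # gaze moved: restart the dwell timer
        elif target is not None and t - start_t >= dwell_time:
            return target                  # dwelled long enough: select
    return None
```

A sequence that pauses on floor "3" for a full second selects it, while a gaze that wanders away in between selects nothing.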
- the method further comprises: selecting a button on the control panel by determining whether the gaze point is on the button.
- the selected floor then may be determined from the selected button. It has to be noted that the same button also may be pressed for selecting the same floor. With the method, the floor may be selected not only by pressing a button but also by looking at it. It also may be possible that other operations are initiated on the control panel, such as generating an emergency call when the person looks at an emergency button.
- a button press on a special button of the control panel may be used to cancel the last stored floor call and/or any other operation.
- the presence detection sensor is a motion detection sensor.
- the presence detection sensor may be an infrared sensor, an electromagnetic sensor, an ultrasonic sensor, etc.
- the presence detection sensor may be integrated into the control panel.
- the presence detection sensor may be a sensor different from the camera providing the video stream for gaze tracking. This may provide the ability to interface with a secondary sensor to control when the one or more cameras and/or other parts of the control system are active, off, in a power-saving state and/or in any other mode of operation.
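The mode control described above can be sketched as a small state machine. The following Python fragment is illustrative only; the state names and the 30 second timeout are assumptions, not values from the patent.

```python
def next_power_state(state, presence, idle_seconds, sleep_after=30):
    """Toy mode controller driven by the presence detection sensor.

    States (assumed names): 'off' -> 'active' as soon as presence is
    sensed; 'active' -> 'saving' after `sleep_after` seconds without
    presence; 'saving' -> 'off' if presence stays absent. Any presence
    signal returns the system to 'active'.
    """
    if presence:
        return "active"                # someone is there: cameras fully on
    if state == "active" and idle_seconds >= sleep_after:
        return "saving"                # nobody seen for a while: power-saving
    if state == "saving":
        return "off"                   # still nobody: switch cameras off
    return state                       # otherwise keep the current mode
```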
- presence detection of a person in front of the control panel also may be performed by analysing the video stream from a camera, such as the first and/or second camera.
- a further aspect of the invention relates to a control system for an elevator system, which comprises a control panel and a controller adapted for performing the method as described in the above and in the following.
- the control panel may be installed in an elevator car and/or at a door of the elevator system.
- the controller may be part of the control panel. It also may be that the controller comprises several parts, for example a part in the control panel and a further part in the central controller of the elevator system, which, for example, may be installed near a drive of the elevator system.
- the control system may be adapted for controlling elevator calls that select the desired floor by tracking eye movements based on where and/or what a person is looking at.
- the controller may comprise one or more processors, which are adapted for performing the method, when a corresponding computer program is executed with them.
- the control panel comprises at least one camera adapted for eye tracking.
- the video stream from the at least one camera may be analysed to determine a line of sight of the eye. It also may be that the video stream of the at least one camera is transmitted to a central processing center for monitoring for safety reasons.
- the central processing center may be connected to a control system of the elevator system by Internet and/or the video stream may be transmitted via Internet.
- the control panel comprises a first module and a second module.
- a module may be a set of mechanically interconnected components that can be installed as one unit in the elevator car and/or at another position.
- the first module of the control panel may comprise buttons, such as floor selection buttons, and a camera.
- the camera may be used for eye tracking.
- the second module of the control panel may comprise a display and a further camera.
- the display may be used for presenting control commands to a person in front of the control panel.
- the further camera alternatively and/or additionally may be used for eye tracking.
- a video stream of the further camera may be transmitted to a central processing center for monitoring the place in front of the control panel.
- the control panel as described in the above and in the following also may be provided as one module.
- the control panel comprises buttons for manually selecting a floor. These buttons may be part of the first module.
- the control panel comprises a display for displaying control commands.
- the display also may be used to provide information for deaf users. For example, information on how to use the gaze tracking method may be displayed as text on the display.
- the display may be part of the second module.
- the control panel comprises a loudspeaker for outputting audio prompts.
- the loudspeaker also may offer audible information for blind users.
- a loudspeaker may be part of the first module and/or the second module.
- the control panel comprises a presence detection sensor.
- the presence detection sensor may be part of the first module.
- a further aspect of the invention relates to an elevator system, which comprises an elevator car movable in an elevator shaft and a control system as described herein.
- the control panel of the control system may be installed in the elevator car. However, it also may be possible that the control panel is installed at a door of the elevator system for getting access to the elevator car.
- features of the elevator system and the control system as described herein may be features of the method for controlling the elevator system, and vice versa.
- Fig. 1 shows an elevator system 10 comprising an elevator car 12 movable in an elevator shaft 14 by a drive 16.
- the elevator system 10 furthermore comprises a central controller 18 (which may be a part of the drive 16 or at least arranged near the drive 16) for controlling the drive 16 and further equipment of the elevator system 10.
- the central controller 18 may also control elevator doors 20.
- the central controller 18 may receive electronic control commands from a control panel 22 inside the elevator car 12. It also may be that here and in the following the control panel 22 is installed outside of the elevator car 12, for example beside one of the doors 20. The control panel 22 and the central controller 18 may be seen as a control system 24 of the elevator system 10.
- the control panel 22 comprises a first module 26 and a second module 28, which will be described in more detail with respect to Fig. 2 .
- the first module 26 comprises floor select buttons 30, an upper camera 32 arranged above the buttons 30, a lower camera 34 arranged below the buttons 30 and a presence detection sensor 36.
- the floor select buttons 30 may be used for selecting a floor to which the elevator car should move. There may be a button 30 for each floor. When a person pushes the respective button 30, a corresponding electronic command is sent via a local controller 38 to the central controller 18, which then controls the elevator system 10 to move the elevator car 12 to the respective floor.
- the local controller 38 may be part of the control panel 22.
- the camera 32 may generate a video stream that may be analysed by the local controller 38 to determine face data and/or retinal data of a person. Also the camera 34 may generate a video stream to determine face data and/or retinal data of a person. This may be performed additionally or alternatively with respect to the video stream of the camera 32.
- the camera 34 may be a low-angle emergency and/or disability camera 34, for example, for high-stress situations and/or for persons whose eyes are at a lower level, such as persons in a wheel-chair or children.
- the lower camera 34 may provide more accurate registering of information for a person in a wheelchair or of shorter stature.
- the camera 34 may be seen as a secondary camera 34 and/or may be used instead of the primary camera 32, when the secondary camera 34 is more viable based on the position of the person in the elevator car 12. Additionally or alternatively, the camera 32 and the camera 34 may be used in conjunction with each other.
- the local controller 38 may determine a line of sight of an eye of a person and a gaze point of the person, in particular a gaze point on the module 26 and/or the module 28.
- the presence detection sensor 36 may be arranged above the buttons 30 and/or the camera 32.
- the presence detection sensor 36 may be adapted for detecting the presence of a person in front of the control panel 22.
- the presence detection sensor 36 is adapted for sensing changes in infrared radiation and/or ultrasonic sound, which may be caused by a human body in front of the control panel 22.
- the module 28 may be arranged above the module 26.
- Module 28 comprises a display 40 and one or more further cameras 42.
- the camera(s) 42 may generate a video stream, which is evaluated by the controller 38 for face tracking and/or gaze tracking data. For example, with the camera 42, gaze tracking data relating to the display 40 may be generated and analysed by the controller 38.
- the video stream of the camera 42 may be transmitted to a central processing center, for example via the controller 38 to the controller 18, which may be connected to the central processing center via Internet.
- the camera 42 may be used for multiple purposes besides gaze detection, including but not limited to in-car monitoring.
- the display 40 may be used for displaying text prompts or visual control commands 44, such as "Yes", "No", "Up", "Down", etc. For example, a selection of choices may be stated, such as "Yes" and "No". A text may be added, such as "Do you need help?". When the display 40 is not used for any other functions, it may serve as an emergency services device.
- Module 28 also may comprise a loudspeaker or audio speaker 46, for example for prompting and interacting with a person inside the elevator car 12.
- Fig. 3 shows a flow diagram of a method that may be executed by the control system 24.
- In the control system 24, most of the control functions are described with respect to the local controller 38. However, it has to be understood that these functions also may be performed by the controller 18 or by a combination of both controllers 18, 38.
- In step S10, only the presence detection sensor 36 may be active, i.e. measurements may be performed with the presence detection sensor 36 and evaluated with the local controller 38.
- Other components of the control panel 22, such as the cameras 32, 34, 42 and the display 40, may be inactive.
- In step S12, if the presence detection sensor 36 does not detect a human presence, the control system 24 continues in a passive state. The control system 24 returns to step S10 and the components 32, 34, 42 may remain inactive.
- In step S14, when the presence of a person in front of the control panel 22 of the elevator system 10 is detected, the control system 24 is switched to an active state, which may mean that the eye-tracking functionality is powered on.
- the camera 32 may be powered on, which then generates a video stream, which is analysed by the local controller 38.
- In step S16, the local controller 38 starts to detect whether there is an eye visible in the video stream.
- the local controller 38 may detect whether or not human eyes are readable by scanning and analysing retinal data in the video stream.
- the local controller 38 may comprise an eye detection module, which, for example based on a neural network and/or machine learning, has been trained to detect portions of images that contain an eye.
- the presence of an eye of a person in front of the control panel 22 may be detected with one or more video streams from one or more of the cameras 32, 34, 42 of the control panel 22.
- the camera 34 may be powered on and its video stream may be analysed. The presence of the eye may then be determined with this second camera 34.
- In step S18, when no eye has been detected (either with one or with more cameras 32, 34, 42), the local controller 38 plays an audio prompt asking the person to look at the control panel 22.
- In step S20, when the eye has been detected, the video stream of the respective camera 32, 34, 42 is analysed for determining a gaze point of the eye. It may be that the gaze point is firstly determined with the first camera 32 and, if this is not possible, secondly with the second camera 34.
- the local controller 38 may comprise a gaze tracking module that analyses one or more video streams of one or more of the cameras. For example, from the portion of the image and/or video stream that has been detected in step S16, the gaze tracking module may determine reflections on the eye. The reflections may have been caused by infrared lights included in the control panel 22 and/or by other light sources, such as the elevator car lighting. From the reflections, an orientation of the eye and/or a view direction may be determined. From the position of the eye and the orientation, a line of sight of the eye may be determined.
- the gaze point then may be determined by intersecting the line of sight with a component of the control panel 22, which component is stored in a virtual layout of the control panel.
- the virtual layout may comprise the positions and/or extensions of the first and/or second module 26, 28, the positions and/or extensions of the buttons 30, the positions and/or extensions of the visual control commands 44, etc.
- the virtual layout may be stored in the local controller 38 and/or may be updated by the local controller 38, for example when the visual control commands 44 change.
- visual control commands 44 may be displayed on the display 40 of the control panel 22.
- In step S22, the local controller 38 detects whether or not the person is looking at the control panel 22 and/or at which component of the control panel 22 the person is looking.
- Otherwise, the control system 24 may return to step S20 and, for example, may output user instructions via the display 40 and/or the loudspeaker 46. For example, if the eyes are not readable, an audio and/or visual prompt may alert the person, stating instructions on how to improve the chances of the person's eyes being readable. Similarly, if the person is not looking at the control panel 22, additional user instructions may be given.
- In step S24, if the gaze point is on the control panel 22, the control system 24 identifies which floor is selected with the gaze point, i.e. a selected floor is determined from the gaze point. This may be done by determining whether the person is looking at a specific button 30 or at one of the visual control commands 44 on the display 40.
- a control command 44 may be selected by determining whether the gaze point is on the control command 44.
- the selected floor then may be determined with the selected control command 44. For example, when the control command is "Up", the selected floor may be the next floor above the current floor. By looking longer at the control command "Up", the number of floors above the current floor may be increased. Analogously, with the control command "Down", a floor below the current floor may be selected.
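The relative selection just described can be sketched in a few lines. The following Python fragment is illustrative only and not part of the patent text; the one-floor-per-second rate and the floor limits are assumptions.

```python
def relative_floor(current_floor, command, dwell_seconds,
                   step_time=1.0, top=10, bottom=0):
    """Map a dwelled 'Up'/'Down' command to a target floor.

    Each `step_time` seconds of dwell adds (or subtracts) one floor,
    with at least one floor per selection, clamped to the assumed
    building limits `bottom`..`top`. All numeric values are
    illustrative choices, not taken from the patent.
    """
    steps = max(1, int(dwell_seconds // step_time))  # longer dwell -> more floors
    if command == "Up":
        return min(top, current_floor + steps)
    if command == "Down":
        return max(bottom, current_floor - steps)
    return current_floor                             # unrelated command: no change
```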
- Another possibility is that the person is looking at one of the buttons 30.
- a button 30 on the control panel 22 may be selected by determining whether the gaze point is on the button 30. The selected floor then may be determined from the selected button 30.
- a layout of the control panel 22 may be stored in the local controller 38, which then can determine from the gaze point the component of the control panel 22 at which the person is looking.
- This layout may comprise the positions of the buttons 30 and/or the display 40.
- In step S26, an electronic control command indicating which floor has been selected is sent to the central controller 18.
- This electronic control command may be the same as the one which is generated when a corresponding floor select button 30 is pushed.
- In step S28, the elevator system 10 controls the drive 16 to move the elevator car 12 to the selected floor. Furthermore, other equipment, such as the doors 20, may be controlled based on the electronic control command.
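Steps S10 to S26 of the flow diagram can be summarized as one pass of a control loop. The following Python sketch is illustrative only and not part of the patent text; the callables and the floor_ naming convention are hypothetical stand-ins for the presence sensor 36, the cameras 32 and 34, the gaze tracking module, the loudspeaker 46 and the command sent to the central controller 18.

```python
def control_cycle(presence, eye_cam1, eye_cam2, gaze_component, speak, send_call):
    """One pass through steps S10-S26 (sketch under assumed interfaces).

    presence, eye_cam1, eye_cam2: callables for the sensor and cameras;
    gaze_component: maps an eye observation to a panel component name;
    speak, send_call: stand-ins for loudspeaker and call registration.
    """
    if not presence():                     # S10/S12: stay passive until someone is there
        return "passive"
    eye = eye_cam1() or eye_cam2()         # S14/S16: first camera, then second as fallback
    if eye is None:
        speak("Please look at the control panel.")   # S18: audio prompt
        return "prompted"
    component = gaze_component(eye)        # S20/S22: gaze point -> panel component
    if component is None or not component.startswith("floor_"):
        return "no_selection"
    floor = int(component.split("_")[1])   # S24: selected floor from the gaze point
    send_call(floor)                       # S26: electronic control command
    return "called"
```

Step S28, moving the car, would then be performed by the central controller on receipt of the call.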
Abstract
A method for controlling an elevator system (10) comprises: detecting the presence of a person in front of a control panel (22) of the elevator system (10) with a presence detection sensor (36); in the case a person has been detected, detecting the presence of an eye of a person in front of the control panel (22) from a video stream of a camera (32, 34, 42) of the control panel (22); when the eye has been detected, determining a gaze point of the eye on the control panel (22) from a video stream of the camera (32, 34, 42) and/or a further camera (32, 34, 42), wherein a line of sight of the eye is determined from an image of the eye in the video stream and the gaze point is determined by intersecting the line of sight with a component of the control panel stored in a virtual layout of the control panel (22); determining a selected floor from the gaze point; and controlling the elevator system (10) to move an elevator car (12) to the selected floor.
Description
- The present invention relates to a method for controlling an elevator system, a control system for the elevator system and to the elevator system.
- Certain disabilities or circumstances may make it difficult for a person to press a call button of an elevator to travel to their intended floor. For example, the buttons may be out of reach for individuals in a wheel-chair, the use of crutches or other aids may pose a problem, or decreased motor function may make it difficult to press the intended button. Others may be hesitant to press the buttons for sanitary reasons or may want the convenience of calling a floor hands-free when they are holding items in both of their hands.
- JP 2010 100 370 describes that a line of sight of a passenger of an elevator car is determined and that the elevator car is stopped at the next floor when the passenger looks at a specific location. This may be used as an alternative to a security button.
- WO 2011 114 489 A1 relates to a guide device for an elevator. The guide device comprises a camera, which takes pictures around the entrance of the elevator. Based on the pictures, it is detected whether or not a person has entered the elevator.
- WO 2005 56251 A1 describes an elevator system with a camera, which detects the face of a person and determines therefrom whether the person uses a wheel-chair.
- There may be a need for a hands-free and economic control method of an elevator system.
- Such a need may be met with the subject-matter of the independent claims. Advantageous embodiments are defined in the dependent claims. Ideas underlying embodiments of the present invention may be interpreted as being based, inter alia, on the following observations and recognitions.
- An aspect of the present invention relates to a method for controlling an elevator system. The method may be automatically performed by a control system of the elevator system and/or may be implemented as a computer program. In general, an elevator system may be any device adapted for transporting persons and/or goods vertically with an elevator car. An elevator system may be installed in a building in an elevator shaft.
- According to an embodiment of the invention, the method comprises: detecting the presence of a person in front of a control panel of the elevator system with a presence detection sensor; in the case a person has been detected, detecting the presence of at least one eye of a person in front of the control panel from a video stream of a camera of the control panel; when the eye has been detected, determining a gaze point of the eye on the control panel from a video stream of the camera and/or a further camera, wherein a line of sight of the eye is determined from an image of the eye in the video stream and the gaze point is determined by intersecting the line of sight with a component of the control panel stored in a virtual layout of the control panel; determining a selected floor from the gaze point; and controlling the elevator system to move an elevator car to the selected floor.
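The claimed sequence of steps can be illustrated with a short sketch. The following Python fragment is purely illustrative and not part of the patent text; the function name, the layout dictionary and the floor_ naming convention are hypothetical stand-ins for the modules described here.

```python
def run_gaze_control(person_present, eyes_by_camera, layout, registered_calls):
    """One pass of the claimed method, under assumed data shapes.

    person_present: bool from the presence detection sensor.
    eyes_by_camera: eye observations per camera (None where no eye was seen).
    layout: virtual layout, mapping component name -> (x, y, w, h) rectangle.
    registered_calls: list to which a selected floor call is appended.
    Returns the selected floor, or None if no selection was made.
    """
    if not person_present:                          # step 1: presence detection
        return None
    eye = next((e for e in eyes_by_camera if e is not None), None)
    if eye is None:                                 # step 2: no camera detected an eye
        return None
    gx, gy = eye["gaze_point"]                      # step 3: gaze point on the panel
    for name, (x, y, w, h) in layout.items():       # intersect with layout components
        if x <= gx <= x + w and y <= gy <= y + h:
            if name.startswith("floor_"):           # step 4: component -> selected floor
                floor = int(name.split("_")[1])
                registered_calls.append(floor)      # step 5: register the car call
                return floor
    return None
```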
- The control system of the elevator system may wait for a person to appear in front of a control panel. In this case, a camera may start to work and/or the video stream of the camera may be analysed to determine whether an eye of the person is visible to the camera or not. It may be possible that an eye detection module analyses the video stream and/or one or more images of the video stream. The eye detection module may be based on a neural network and/or machine learning and/or may have been trained to detect a portion of an image that contains an eye.
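The frame-scanning part of this analysis can be sketched independently of the concrete detector. The fragment below is illustrative only; `detector` is an assumed plug-in (for example a trained neural network, or an off-the-shelf detector such as OpenCV's Haar eye cascade, neither of which is bundled here), and the frame limit is an arbitrary choice.

```python
def find_eye_region(frames, detector, max_frames=30):
    """Scan successive video frames until an eye region is found.

    `detector` is any callable mapping a frame to a list of
    (x, y, w, h) candidate eye boxes. Returns (frame_index, box)
    of the first detection, or None if no eye appears within
    `max_frames` frames.
    """
    for i, frame in enumerate(frames):
        if i >= max_frames:          # give up after a bounded number of frames
            break
        boxes = detector(frame)
        if boxes:                    # first detection wins
            return i, boxes[0]
    return None
```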
- When the eye is visible, the video stream may be analysed to determine a line of sight of the eye and/or whether the person is looking at the control panel. A gaze tracking module that analyses one or more video streams of one or more of the cameras may perform the determination of the line of sight. For example, from the portion of the image and/or video stream that has been detected, the gaze tracking module may determine reflections on the eye. The reflections may have been caused by infrared lights and/or by other light sources, such as an elevator car lighting. From the reflections an orientation of the eye and/or view direction may be determined. From the position of the eye and the orientation a line of sight of the eye may be determined.
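The reflection-based estimation described above can be reduced to a highly simplified geometric sketch. The fragment below is illustrative only and not the patented algorithm: real gaze estimation requires per-person calibration and a 3D eye model, and the linear `gain` mapping pixel offset to angle is an assumption.

```python
import math

def gaze_direction(pupil, glint, gain=0.1):
    """Very simplified gaze estimate from one corneal reflection.

    The offset between the pupil centre and the glint of a fixed
    (e.g. infrared) light source shifts as the eye rotates; an assumed
    linear `gain` (normally obtained by calibration) converts the pixel
    offset into gaze angles. Returns (yaw, pitch) in radians.
    """
    dx = pupil[0] - glint[0]
    dy = pupil[1] - glint[1]
    return (gain * dx, gain * dy)

def line_of_sight(yaw, pitch):
    """Unit direction vector of the line of sight for given gaze angles."""
    d = (math.sin(yaw) * math.cos(pitch),
         math.sin(pitch),
         math.cos(yaw) * math.cos(pitch))
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)
```

Combined with the detected eye position, this direction vector yields the line of sight used in the next step.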
- From the line of sight, a gaze point on the control panel may be determined, which also may be performed by the gaze tracking module. The position of the camera and/or of the control panel and/or optionally a virtual layout of the control panel may be stored in a controller, which performs the method. From this information, it may be determined at which part of the control panel the person is looking. For example, such a part may be a display and/or may be a button, such as a floor selection button, of the control panel. Thus, from the gaze point it may be determined to which floor the person intends to go. The elevator car then may be controlled to move to this floor.
- The virtual layout may comprise the positions and/or extensions of buttons on the control panel, the positions and/or extensions of visual control commands on a display of the control panel, etc.
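The intersection of the line of sight with the stored virtual layout can be sketched as a ray-plane intersection followed by a rectangle lookup. The fragment below is illustrative only; it assumes the panel is a plane at a known depth and that the layout stores axis-aligned rectangles, which are simplifying assumptions not stated in the patent.

```python
def intersect_panel(eye_pos, direction, panel_z):
    """Intersect the line of sight with an (assumed planar) control panel.

    eye_pos: (x, y, z) position of the eye; direction: line-of-sight
    vector; panel_z: assumed depth of the panel plane. Returns the
    (x, y) gaze point on the panel, or None when the person looks away.
    """
    if direction[2] <= 0:
        return None                          # looking away from the panel
    t = (panel_z - eye_pos[2]) / direction[2]
    if t < 0:
        return None
    return (eye_pos[0] + t * direction[0], eye_pos[1] + t * direction[1])

def component_at(layout, point):
    """Look the gaze point up in a virtual layout: name -> (x, y, w, h)."""
    if point is None:
        return None
    px, py = point
    for name, (x, y, w, h) in layout.items():
        if x <= px <= x + w and y <= py <= y + h:
            return name
    return None
```

The component name returned by `component_at` then plays the role of the button or visual control command the person is looking at.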
- It also may be that a floor selection is cancelled by looking at a visual cue, such as a text, a symbol, etc., and then moving the gaze point to the previously selected floor. Such a visual cue also may be displayed as a visual control command on a display.
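This two-step cancellation gesture can be sketched as a scan over the sequence of gaze targets. The fragment below is illustrative only; the cue name and the representation of the gaze sequence are assumptions.

```python
def apply_cancel_gesture(registered_calls, gaze_sequence, cue="cancel"):
    """Cancel the last registered call on a cue-then-floor gaze gesture.

    The last call is removed when the gaze hits the cancel cue and
    immediately afterwards returns to the previously selected floor.
    `gaze_sequence` is an assumed list of gaze targets in time order.
    Returns True if a call was cancelled.
    """
    for prev, cur in zip(gaze_sequence, gaze_sequence[1:]):
        if prev == cue and registered_calls and cur == registered_calls[-1]:
            registered_calls.pop()           # undo the previous selection
            return True
    return False
```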
- The implementation of an eye tracking system for calling floors may help facilitate the independence of disabled people as well as provide a convenience to others. With the method, an elevator floor destination may be selected requiring no tactile input. The method may use eye-tracking and gaze point detection algorithms so that a person can send command prompts to the control panel hands-free.
- This may result in increased efficiency and less time spent picking floors. There may be no need to request other persons to push a button. Disabled persons may use the elevator easily; for example, wheel-chair users may select the floor hands-free. Furthermore, additional support may be provided to persons in situations of stress and emergency.
- According to an embodiment of the invention, the method further comprises: playing an audio prompt instructing the person to look at the control panel, when a person has been detected and no eye has been detected. In the case the controller is not able to detect an eye in the video stream of the camera, an audio prompt with instructions on how a floor can be selected with gaze tracking may be output via a loudspeaker.
- According to an embodiment of the invention, the method further comprises: determining the presence of the eye with a second camera of the control panel, when no eye has been detected with the camera being a first camera. The control panel may comprise more than one camera used for eye tracking. When the controller analyses the video stream of the first camera and is not able to detect an eye, it may switch to the video stream of another camera. The video stream from the second camera may be analysed for the presence of an eye and/or, when an eye has been detected, for a line of sight and/or the gaze point. To this end, a position of the second camera relative to the control panel may be stored in the controller.
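The first/second-camera fallback can be sketched as a priority loop over the available video sources. `detect_eye` here is a stand-in for whatever eye-detection module is used; the camera identifiers are illustrative:

```python
def detect_eye_with_fallback(cameras, detect_eye):
    """Try each camera in priority order (first camera first).

    cameras: iterable of (camera_id, frame) pairs.
    detect_eye: callable returning eye data for a frame, or None.
    Returns (camera_id, eye) for the first stream in which an eye is
    found, or (None, None) if no camera sees an eye.
    """
    for cam_id, frame in cameras:
        eye = detect_eye(frame)
        if eye is not None:
            return cam_id, eye
    return None, None
```

Ordering the list with the lower camera second (or first, for a panel aimed at wheel-chair users) realises the switching behaviour described above.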
- According to an embodiment of the invention, the second camera is installed lower than the first camera. For example, the first camera (or the second camera) may be installed at a height adapted for eye tracking of a standing adult. The second camera (or the first camera) may be installed at a height adapted for eye tracking of a child and/or a person in a wheel-chair.
- According to an embodiment of the invention, the gaze point is determined with the second camera. It may be that the first camera is solely used for determining whether an eye is visible or not and the second camera is then used for eye tracking. For example, the first camera may consume less power than the second one, which may reduce the power consumed by the control system.
- According to an embodiment of the invention, the method further comprises: displaying visual control commands on a display of the control panel. For example, when a person has been detected and/or when the eye has been detected, control commands that may be selected with the eye are displayed. Such control commands may comprise commands like "Do you need help? Yes/No", "Do you want to move up? Yes/No", etc. The control commands may provide possibilities to control the movement of the elevator car and/or the selection of a floor. In general, the display may show visual symbols and/or text as control commands. The display and/or screen may be integrated into the control panel or may be provided in the elevator car. It may allow a person to interact with the control system with his or her eye movement. The person may interact with control command prompts on the display screen in an emergency scenario, and the system also may allow a person at an offsite location to control what is being displayed on the display.
- According to an embodiment of the invention, the method further comprises: selecting a control command by determining whether the gaze point is on the control command. The selected floor then may be determined with the selected control command. For example, the control command may be a number representing a floor. When the gaze point stays on this number, the control system may decide that the respective floor has been selected.
- According to an embodiment of the invention, the method further comprises: selecting a button on the control panel by determining whether the gaze point is on the button. The selected floor then may be determined from the selected button. It has to be noted that the same button also may be pressed for selecting the same floor. With the method, the floor may not solely be selected by pressing a button, but also by looking at the button. It also may be possible that other operations are initiated on the control panel, such as generating an emergency call, when the person looks at an emergency button.
- For example, to ensure that a person really has selected a specific button, it may be that the gaze point has to stay on the specific button for more than a specific time, such as three seconds. A button press on a special button of the control panel may be used to cancel the last stored floor call and/or any other operation.
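The dwell criterion above can be implemented as a small state machine fed with the current gaze target and a timestamp. The three-second threshold matches the example in the text; the target names are illustrative:

```python
class DwellSelector:
    """Register a selection once the gaze stays on one target long enough."""

    def __init__(self, dwell_seconds=3.0):
        self.dwell_seconds = dwell_seconds
        self.target = None   # target currently being dwelt on
        self.since = None    # timestamp when dwelling on it started

    def update(self, target, now):
        """Feed the current gaze target and time; return a selection or None."""
        if target != self.target:
            # Gaze moved to a new target (or off the panel): restart the timer.
            self.target, self.since = target, now
            return None
        if target is not None and now - self.since >= self.dwell_seconds:
            self.since = now  # re-arm so one dwell fires only one selection
            return target
        return None
```

Feeding `None` as the target (gaze off the panel) resets the timer, so brief glances never trigger a floor call.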
- According to an embodiment of the invention, the presence detection sensor is a motion detection sensor. For example, the presence detection sensor may be an infrared sensor, an electromagnetic sensor, an ultrasonic sensor, etc. Like the one or more cameras, the presence detection sensor may be integrated into the control panel. In particular, the presence detection sensor may be a sensor different from the camera providing the video stream for gaze tracking. This may provide the ability to interface with a secondary sensor to control when the one or more cameras and/or other parts of the control system are active, off, in a power-saving state and/or in any other mode of operation.
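The interplay between the presence sensor and the camera power states can be sketched as a mode-selection function. The mode names are invented for illustration; only the sleep/active split is taken from the text:

```python
def select_mode(presence_detected, eye_detected):
    """Decide the operating mode of the eye-tracking hardware.

    'sleep':     only the presence detection sensor is active
    'searching': a person is present, cameras are on, no eye found yet
                 (an audio prompt may be played in this mode)
    'tracking':  an eye has been detected and gaze tracking runs
    """
    if not presence_detected:
        return "sleep"
    if not eye_detected:
        return "searching"
    return "tracking"
```

Calling this on every sensor update keeps the cameras powered down whenever nobody stands in front of the panel.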
- It has to be noted that presence detection of a person in front of the control panel also may be performed by analysing the video stream from a camera, such as the first and/or second camera.
- A further aspect of the invention relates to a control system for an elevator system, which comprises a control panel and a controller adapted for performing the method as described in the above and in the following. The control panel may be installed in an elevator car and/or at a door of the elevator system. The controller may be part of the control panel. It also may be that the controller comprises several parts, for example a part in the control panel and a further part in the central controller of the elevator system, which, for example, may be installed near a drive of the elevator system. The control system may be adapted for controlling elevator calls that select the desired floor by tracking eye movements based on where and/or what a person is looking at.
- The controller may comprise one or more processors, which are adapted for performing the method, when a corresponding computer program is executed with them.
- According to an embodiment of the invention, the control panel comprises at least one camera adapted for eye tracking. The video stream from the at least one camera may be analysed to determine a line of sight of the eye. It also may be that the video stream of the at least one camera is presented to a central processing center for safety monitoring. The central processing center may be connected to a control system of the elevator system via the Internet and/or the video stream may be transmitted via the Internet.
- According to an embodiment of the invention, the control panel comprises a first module and a second module. In this context, a module may be a set of mechanically interconnected components that can be installed as one unit in the elevator car and/or at another position. The first module of the control panel may comprise buttons, such as floor selection buttons, and a camera. The camera may be used for eye tracking. The second module of the control panel may comprise a display and a further camera. The display may be used for presenting control commands to a person in front of the control panel. The further camera alternatively and/or additionally may be used for eye tracking. A video stream of the further camera may be transmitted to a central processing center for monitoring the place in front of the control panel.
- Alternatively, all components of the control panel as described in the above and in the following also may be provided as one module.
- According to an embodiment of the invention, the control panel comprises buttons for manually selecting a floor. These buttons may be part of the first module.
- According to an embodiment of the invention, the control panel comprises a display for displaying control commands. The display also may be used to provide information to deaf users. For example, information on how to use the gaze tracking method may be displayed as text on the display. The display may be part of the second module.
- According to an embodiment of the invention, the control panel comprises a loudspeaker for outputting audio prompts. The loudspeaker also may offer audible information for blind users. A loudspeaker may be part of the first module and/or the second module.
- According to an embodiment of the invention, the control panel comprises a presence detection sensor. The presence detection sensor may be part of the first module.
- A further aspect of the invention relates to an elevator system, which comprises an elevator car movable in an elevator shaft and a control system as described herein. The control panel of the control system may be installed in the elevator car. However, it also may be possible that the control panel is installed at a door of the elevator system for getting access to the elevator car.
- It has to be noted that features of the elevator system and the control system as described herein may be features of the method for controlling the elevator system, and vice versa.
- In the following, advantageous embodiments of the invention will be described with reference to the enclosed drawings. However, neither the drawings nor the description shall be interpreted as limiting the invention.
-
Fig. 1 schematically shows an elevator system according to an embodiment of the invention. -
Fig. 2 schematically shows a control panel for a control system according to an embodiment of the invention. -
Fig. 3 shows a flow diagram for a method for controlling an elevator system according to an embodiment of the invention. - The figures are only schematic and not to scale. Same reference signs refer to same or similar features.
-
Fig. 1 shows an elevator system 10 comprising an elevator car 12 movable in an elevator shaft 14 by a drive 16. The elevator system 10 furthermore comprises a central controller 18 (which may be a part of the drive 16 or at least arranged near the drive 16) for controlling the drive 16 and further equipment of the elevator system 10. For example, the central controller 18 may also control elevator doors 20. - The
central controller 18 may receive electronic control commands from a control panel 22 inside the elevator car 12. It also may be that here and in the following the control panel 22 is installed outside of the elevator car 12, for example beside one of the doors 20. The control panel 22 and the central controller 18 may be seen as a control system 24 of the elevator system 10. - The
control panel 22 comprises a first module 26 and a second module 28, which will be described in more detail with respect to Fig. 2. - The
first module 26 comprises floor select buttons 30, an upper camera 32 arranged above the buttons 30, a lower camera 34 arranged below the buttons 30 and a presence detection sensor 36. - The floor
select buttons 30 may be used for selecting a floor to which the elevator car should move. There may be a button 30 for each floor. When a person pushes the respective button 30, a corresponding electronic command is sent via a local controller 38 to the central controller 18, which then controls the elevator system 10 to move the elevator car 12 to the respective floor. The local controller 38 may be part of the control panel 22. - The
camera 32 may generate a video stream that may be analysed by the local controller 38 to determine face data and/or retinal data of a person. Also the camera 34 may generate a video stream to determine face data and/or retinal data of a person. This may be performed additionally or alternatively with respect to the video stream of the camera 32. The camera 34 may be a low-angle emergency and/or disability camera 34, for example for high-stress situations and/or for persons with eyes on a lower level, such as persons in a wheel-chair or children. - The
lower camera 34 may provide more accurate registering of information for a person in a wheelchair or of shorter stature. In addition, the camera 34 may be seen as a secondary camera 34 and/or may be used instead of the primary camera 32, when the secondary camera 34 is more viable based on the position of the person in the elevator car 12. Additionally or alternatively, the camera 32 and the camera 34 may be used in conjunction with each other. - From the video stream(s), the
local controller 38 may determine a line of sight of an eye of a person and a gaze point of the person, in particular a gaze point on the module 26 and/or the module 28. - The
presence detection sensor 36 may be arranged above the buttons 30 and/or the camera 32. The presence detection sensor 36 may be adapted for detecting the presence of a person in front of the control panel 22. For example, the presence detection sensor 36 is adapted for sensing changes in infrared radiation and/or in ultrasonic sound, which may be caused by a human body in front of the control panel 22. - The
module 28 may be arranged above the module 26. Module 28 comprises a display 40 and one or more further cameras 42. Also the camera(s) 42 may generate a video stream, which is evaluated by the controller 38 for face tracking and/or gaze tracking data. For example, with the camera 42, gaze tracking data relating to the display 40 may be generated and analysed by the controller 38. - For additional support and/or emergency purposes the video stream of the
camera 42 may be transmitted to a central processing center, for example via the controller 38 to the controller 18, which may be connected to the central processing center via the Internet. The camera 42 may be used for multiple purposes besides gaze detection, including but not limited to in-car monitoring. - The
display 40 may be used for displaying text prompts or visual control commands 44, such as "Yes", "No", "Up", "Down", etc. For example, a selection of choice may be stated, such as "Yes" and "No". A text may be added, such as "Do you need help?". When the display 40 is not used for any other functions, it may serve as an emergency services device. -
Module 28 also may comprise a loudspeaker or audio speaker 46, for example for prompting and interacting with a person inside the elevator car 12. -
Fig. 3 shows a flow diagram of a method that may be executed by the control system 24. In the following, most of the control functions are described with respect to the local controller 38. However, it has to be understood that these functions also may be performed by the controller 18 or by a combination of both controllers. - In step S10, only the
presence detection sensor 36 may be active, i.e. measurements may be performed with the presence detection sensor 36 and evaluated with the local controller 38. Other components of the control panel 22, such as the cameras and the display 40, may be inactive. - In step S12, if the
presence detection sensor 36 does not detect a human presence, the control system 24 continues in a passive state. The control system 24 returns to step S10 and the components of the control panel 22 remain inactive. - In step S14, when the presence of a person in front of a
control panel 22 of the elevator system 10 is detected, the control system 24 is switched to an active state, which may mean that the eye-tracking functionality is powered on. For example, firstly the camera 32 may be powered on, which then generates a video stream, which is analysed by the local controller 38. - In step S16, the
local controller 38 starts to detect whether there is an eye visible in the video stream. The local controller 38 may detect whether or not human eyes are readable by scanning and analysing retinal data in the video stream. - For example, the
local controller 38 may comprise an eye detection module which, for example based on a neural network and/or machine learning, has been trained to detect portions of images that contain an eye. - In general, the presence of an eye of a person in front of the
control panel 22 may be detected with one or more video streams from one or more of the cameras of the control panel 22. - For example, when no eye has been detected with the
first camera 32, the camera 34 may be powered on and the video stream of the camera 34 may be analysed. The presence of the eye may then be determined with the second camera 34. - In step S18, when no eye has been detected (either with one or with
more cameras local controller 38 plays an audio prompt indicating the person to look at thecontrol panel 22. - In step S20, when the eye has been detected, the video stream of the
respective camera is analysed to determine the gaze point, firstly with the first camera 32 and, if this is not possible, secondly with the second camera 34. - The
local controller 38 may comprise a gaze tracking module that analyses one or more video streams of one or more of the cameras. For example, from the portion of the image and/or video stream that has been detected in step S14, the gaze tracking module may determine reflections on the eye. The reflections may have been caused by infrared lights included in the control panel 22 and/or by other light sources, such as the elevator car lighting. From the reflections, an orientation of the eye and/or a view direction may be determined. From the position and orientation of the eye, a line of sight may be determined. - The gaze point then may be determined by intersecting the line of sight with a component of the
control panel 22, which component is stored in a virtual layout of the control panel. The virtual layout may comprise the positions and/or extensions of the first and/or second module 26, 28, the buttons 30, the positions and/or extensions of the visual control commands 44, etc. The virtual layout may be stored in the local controller 38 and/or may be updated by the local controller 38, for example when the visual control commands 44 change. - Additionally, when the eye has been detected, visual control commands 44 may be displayed on the
display 40 of the control panel 22. - In step S22, the
local controller 38 detects whether or not the person is looking at the control panel 22 and/or at which component of the control panel 22 the person is looking. - When the gaze point is not on the
control panel 22, the control system 24 may return to step S20 and, for example, may output user instructions via the display 40 and/or the loudspeaker 46. For example, if the eyes are not readable, an audio and/or visual prompt may alert the person, stating instructions on how to improve the chances of the person's eyes being readable. Similarly, if the person is not looking at the control panel 22, additional user instructions may be given. - In step S24, if the gaze point is on the
control panel 22, the control system 24 identifies which floor is selected with the gaze point, i.e. a selected floor is determined from the gaze point. This may be done by determining whether the person is looking at a specific button 30 or with the visual control commands 44 on the display 40. - For example, a
control command 44 may be selected by determining whether the gaze point is on the control command 44. The selected floor then may be determined with the selected control command 44. For example, when the control command is "Up", the selected floor may be the next floor above the current floor. By looking longer at the control command "Up", the number of floors above the current floor may be increased. Analogously, with the control command "Down", a floor below the current floor may be selected. - Another possibility is that the person is looking at one of the
buttons 30. A button 30 on the control panel 22 may be selected by determining whether the gaze point is on the button 30. The selected floor then may be determined from the selected button 30. - In both cases, a layout of the
control panel 22 may be stored in the local controller 38, which then can determine from the gaze point the component of the control panel 22 the person is looking at. This layout may comprise the positions of the buttons 30 and/or the display 40. - In step S26, an electronic control command is sent to the
central controller 18, indicating which floor has been selected. This electronic control command may be the same as the one which is generated when a corresponding floor select button 30 is pushed. - In step S28, the
elevator system 10 controls the drive 16 to move the elevator car 12 to the selected floor. Furthermore, other equipment, such as the doors 20, may be controlled based on the electronic control command. - Finally, it should be noted that the term "comprising" does not exclude other elements or steps and that "a" or "an" does not exclude a plurality. Also, elements described in association with different embodiments may be combined. It should also be noted that reference signs in the claims should not be construed as limiting the scope of the claims.
- 10: elevator system
- 12: elevator car
- 14: elevator shaft
- 16: drive
- 18: central controller
- 20: elevator door
- 22: control panel
- 24: control system
- 26: first module
- 28: second module
- 30: floor select button
- 32: first camera
- 34: second camera
- 36: presence detection sensor
- 38: local controller
- 40: display
- 42: further camera
- 44: visual control command
- 46: loudspeaker
Claims (15)
- A method for controlling an elevator system (10), the method comprising: detecting the presence of a person in front of a control panel (22) of the elevator system (10) with a presence detection sensor (36); in the case a person has been detected, detecting the presence of at least one eye of a person in front of the control panel (22) from a video stream of a camera (32, 34, 42) of the control panel (22); in the case the eye has been detected, determining a gaze point of the eye on the control panel (22) from a video stream of the camera (32, 34, 42) and/or a further camera (32, 34, 42), wherein a line of sight of the eye is determined from an image of the eye in the video stream and the gaze point is determined by intersecting the line of sight with a component of the control panel stored in a virtual layout of the control panel (22); determining a selected floor from the gaze point; controlling the elevator system (10) to move an elevator car (12) to the selected floor.
- The method of claim 1, further comprising:
when a person has been detected and no eye has been detected, playing an audio prompt instructing the person to look at the control panel (22). - The method of claim 1, further comprising:
when no eye has been detected with the camera (32) being a first camera, determining the presence of the eye with a second camera (34) of the control panel (22). - The method of claim 3,
wherein the second camera (34) is installed lower than the first camera (32). - The method of claim 3 or 4,
wherein the gaze point is determined with the second camera (34). - The method of one of the previous claims, further comprising: displaying control commands (44) on a display (40) of the control panel (22); selecting a control command (44) by determining whether the gaze point is on the control command (44); wherein the selected floor is determined with the selected control command (44).
- The method of one of the previous claims, further comprising: selecting a button (30) on the control panel (22) by determining whether the gaze point is on the button (30); wherein the selected floor is determined from the selected button (30).
- The method of one of the previous claims,
wherein the presence detection sensor (36) is a motion detection sensor. - A control system (24) for an elevator system (10), the control system (24) comprising: a control panel (22); a controller (38, 18) adapted for performing the method of one of the previous claims.
- The control system (24) of claim 9,
wherein the control panel (22) comprises at least one camera (32, 34, 42) adapted for eye tracking. - The control system (24) of claim 9 or 10,
wherein the control panel (22) comprises a first module (26) and a second module (28);
wherein the first module (26) comprises buttons (30) and a camera (32, 34);
wherein the second module (28) comprises a display (40) and a camera (42). - The control system (24) of one of claims 9 to 11,
wherein the control panel (22) comprises buttons (30) for manually selecting a floor. - The control system (24) of one of claims 9 to 12,
wherein the control panel (22) comprises a display (40) for displaying control commands (44). - The control system (24) of one of claims 9 to 13,
wherein the control panel (22) comprises a loudspeaker (46) for outputting audio prompts; and/or
wherein the control panel (22) comprises a presence detection sensor (36). - An elevator system (10), comprising:an elevator car (12) movable in an elevator shaft (14);a control system (24) according to one of claims 9 to 14;wherein the control panel (22) of the control system (24) is installed in the elevator car (12).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18175194.2A EP3575257A1 (en) | 2018-05-30 | 2018-05-30 | Control of elevator with gaze tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3575257A1 true EP3575257A1 (en) | 2019-12-04 |
Family
ID=62492512
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022003231A1 (en) | 2020-06-29 | 2022-01-06 | Kone Corporation | Controlling of elevator system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005056251A1 (en) | 2003-12-10 | 2005-06-23 | Koninklijke Philips Electronics N.V. | Shaving head with skin stretching member |
JP2007161420A (en) * | 2005-12-14 | 2007-06-28 | Hitachi Ltd | Call registering device of elevator |
JP2010100370A (en) | 2008-10-22 | 2010-05-06 | Hitachi Ltd | Operation input device for elevator and method therefor |
WO2011114489A1 (en) | 2010-03-18 | 2011-09-22 | 三菱電機株式会社 | Guide device for elevator |
US20150309570A1 (en) * | 2009-04-09 | 2015-10-29 | Dynavox Systems Llc | Eye tracking systems and methods with efficient text entry input features |
JP2017013984A (en) * | 2015-07-03 | 2017-01-19 | 株式会社日立製作所 | Elevator device |
Legal Events

- PUAI: Public reference made under article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)
- AK: Designated contracting states (kind code of ref document: A1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
- AX: Request for extension of the European patent; extension states: BA ME
- STAA: Status: the application has been withdrawn
- 18W: Application withdrawn; effective date: 20191216