WO2020157367A1 - User interface solution for elevator system - Google Patents

User interface solution for elevator system Download PDF

Info

Publication number
WO2020157367A1
WO2020157367A1 (PCT/FI2019/050063)
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
interface surface
control unit
user
proximity sensor
Prior art date
Application number
PCT/FI2019/050063
Other languages
French (fr)
Inventor
Jaana HYVÄRINEN
Jukka KORPIHETE
Original Assignee
Kone Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kone Corporation filed Critical Kone Corporation
Priority to PCT/FI2019/050063
Publication of WO2020157367A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 Control systems of elevators in general
    • B66B1/34 Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/46 Adaptations of switches or switchgear
    • B66B1/461 Adaptations of switches or switchgear characterised by their shape or profile
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 Control systems of elevators in general
    • B66B1/34 Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/46 Adaptations of switches or switchgear
    • B66B1/467 Adaptations of switches or switchgear characterised by their mounting position
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 Control systems of elevators in general
    • B66B1/34 Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/46 Adaptations of switches or switchgear
    • B66B1/468 Call registering systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Elevator Control (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)

Abstract

The invention relates to a method for an elevator system, the method comprising: detecting (310) a presence of an object in an operational area of at least one proximity sensor (210) based on data received from the at least one proximity sensor (210); and generating a signal for activating (320) at least one user interface surface (130), wherein the at least one user interface surface (130) being activated is selected in accordance with the at least one proximity sensor (210) from which the data generating a detection of the presence of the object in the operational area of the at least one proximity sensor (210) is received. The invention also relates to a control unit, an elevator system and a computer program product.

Description

User interface solution for elevator system
TECHNICAL FIELD
The invention concerns in general the technical field of elevators. More particularly, the invention concerns a user interface solution for the elevators.
BACKGROUND
Existing elevator systems have several user interfaces which enable interaction of users, such as passengers, with the elevator system. Typically, one user interface resides in the elevator car and another at the landing floor. The former is commonly known as a car operating panel whereas the latter is known as a hall operating panel. The mentioned user interfaces may comprise input devices, such as buttons and the like, for requesting a service from the elevator system. The requested service may e.g. refer to requesting transportation from the elevator system, requesting opening/closing of doors, requesting a communication connection to a contact center, indicating an emergency situation and so on.
The user interface solutions according to the prior art are based on the idea that they are fixedly mounted at predetermined positions. The positions are selected to be as optimal as possible for users of the elevator system. However, the selection of the position is based on statistical analysis, which means that at least some users, such as persons in wheelchairs, may experience difficulties in using the user interfaces of the elevator system. Moreover, in a situation where an elevator car is heavily crowded it may be challenging for at least some passengers to reach the user interface, i.e. the car operating panel, which may cause frustration and an undesired user experience of the elevator system. On the other hand, developments in the field of electronics provide new approaches to the implementation of user interfaces. Namely, so-called printed electronics enables the production of large surfaces capable of operating as a user interface in a plurality of application areas.
SUMMARY
The following presents a simplified summary in order to provide a basic understanding of some aspects of various embodiments of the invention. The summary is not an extensive overview of the invention. It is neither intended to identify key or critical elements of the invention nor to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a simplified form as a prelude to a more detailed description of exemplifying embodiments of the invention.
An objective of the invention is to present a method, a control unit, an elevator system and a computer program product for controlling at least one user interface of an elevator system.
The objectives of the invention are reached by a method, a control unit, an elevator system and a computer program product for controlling at least one user interface of an elevator system as defined by the respective independent claims. According to a first aspect, a method for an elevator system is provided, the method comprising: detecting a presence of an object in an operational area of at least one proximity sensor based on data received from the at least one proximity sensor; and generating a signal for activating at least one user interface surface, wherein the at least one user interface surface being activated is selected in accordance with the at least one proximity sensor from which the data generating a detection of the presence of the object in the operational area of the at least one proximity sensor is received. The method may further comprise: detecting a user indication provided through the at least one user interface surface; and generating a signal causing an establishment of at least one I/O panel on the at least one user interface surface in accordance with the user indication provided through the at least one user interface surface.
An activation of the at least one user interface surface may be indicated with at least one of the following: visual signal, audible signal, haptic signal.
A detection of the user indication provided through the at least one user interface surface may be performed by: detecting of a touch on the at least one user interface surface, detecting a presence of a pointer within a distance threshold of the at least one proximity sensor.
A position of the at least one established I/O panel on the at least one user interface surface may be arranged to be relative to the position of the detection of the user indication. For example, the position of the at least one established I/O panel on the at least one user interface surface may correspond to the position of the detection of the user indication.
According to a second aspect, a control unit of an elevator system is provided, the control unit being configured to: detect a presence of an object in an operational area of at least one proximity sensor based on data received from the at least one proximity sensor; and generate a signal for activating at least one user interface surface, wherein the at least one user interface surface being activated is selected in accordance with the at least one proximity sensor from which the data generating a detection of the presence of the object in the operational area of the at least one proximity sensor is received. The control unit may further be configured to: detect a user indication provided through the at least one user interface surface; and generate a signal causing an establishment of at least one I/O panel on the at least one user interface surface in accordance with the user indication provided through the at least one user interface surface.
The control unit may also be configured to generate a signal causing an indication of an activation of the at least one user interface surface with at least one of the following: visual signal, audible signal, haptic signal.
The control unit may be configured to perform a detection of the user indication from data provided through the at least one user interface surface by: detecting of a touch on the at least one user interface surface, detecting a presence of a pointer within a distance threshold of the at least one proximity sensor. The control unit may be configured to determine a position of the at least one established I/O panel on the at least one user interface surface to be relative to the position of the detection of the user indication. For example, the control unit may be configured to determine the position of the at least one established I/O panel on the at least one user interface surface to correspond to the position of the detection of the user indication.
According to a third aspect, an elevator system is provided, the elevator system comprising: at least one proximity sensor; at least one user interface surface; and a control unit as described above according to the second aspect.
The at least one user interface surface may be arranged to at least one of the following: an elevator car, a surface of a building where the elevator system resides.
According to a fourth aspect, a computer program is provided which, when executed by at least one processor, causes a control unit of an elevator system to perform the method as described according to the first aspect. The expression "a number of" refers herein to any positive integer starting from one, e.g. to one, two, or three. The expression "a plurality of" refers herein to any positive integer starting from two, e.g. to two, three, or four.
Various exemplifying and non-limiting embodiments of the invention both as to constructions and to methods of operation, together with additional objects and advantages thereof, will be best understood from the following description of specific exemplifying and non-limiting embodiments when read in connection with the accompanying drawings.
The verbs "to comprise" and "to include" are used in this document as open limitations that neither exclude nor require the existence of unrecited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of "a" or "an", i.e. a singular form, throughout this document does not exclude a plurality.
BRIEF DESCRIPTION OF FIGURES
The embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.
Figures 1A and 1B illustrate schematically non-limiting examples of an elevator system according to an embodiment of the invention.
Figure 2 illustrates schematically an elevator car according to an embodiment of the invention.
Figures 3A and 3B illustrate schematically aspects relating to a method according to embodiments of the invention.
Figure 4 illustrates schematically further aspects relating to a method according to an embodiment of the invention.
Figure 5 illustrates schematically a user interface surface according to an embodiment of the invention.
Figure 6 illustrates schematically a control unit according to an embodiment of the invention.
DESCRIPTION OF THE EXEMPLIFYING EMBODIMENTS
The specific examples provided in the description given below should not be construed as limiting the scope and/or the applicability of the appended claims. Lists and groups of examples provided in the description given below are not exhaustive unless otherwise explicitly stated.
At least some aspects of embodiments according to the present invention may be described, at least in part, by referring to Figures 1A and 1B. Figures 1A and 1B illustrate schematically non-limiting examples of an elevator system in which a user interface surface 130 may be arranged on one or more surfaces of a building, such as on walls or shaft doors 120, or on one or more surfaces of an elevator car 150, such as on side walls, on the floor, on the ceiling or on the doors. The previous are non-limiting examples of the surfaces on which the user interface surfaces may be implemented. Advantageously, the user interface surface may be manufactured with so-called printed electronics, which enables an integration of components having a plurality of functionalities on the same surface with a thin structure. For the purposes of the present invention the user interface surface may be implemented so that it comprises one or more proximity sensors and components by means of which a display screen or a touch screen may be implemented. For example, the display screen may be implemented with a so-called electroluminescent panel manufactured with printed electronics. Moreover, according to an embodiment of the invention the user interface surface may comprise electronics detecting a touch, such as electronics detecting changes in resistance or in capacitance, as non-limiting examples. Moreover, according to some embodiments the user interface surface may be equipped with electronics suitable for providing an output detectable by the user, such as a haptic output, a visual output or an audible output. Figures 1A and 1B also illustrate schematically a control unit 110 arranged to control, at least in part, at least some entities of the elevator system. The entities may e.g. be one or more of the user interface surfaces 130. The communication between the control unit 110 and the entities may be implemented in a wired manner or, at least in part, wirelessly.
Figures 1A and 1B illustrate an aspect of the present invention in which an I/O panel 140 may be established to a position on at least one user interface surface 130 in accordance with information obtained in a manner as will be described. The establishment of the I/O panel 140 refers to, but is not limited to, an arrangement by means of which the I/O panel 140 may be visualized, i.e. displayed, on the user interface surface 130.
For describing at least some aspects of the present invention reference is made to Figure 2, in which an elevator car 150 is schematically illustrated into which at least one user interface surface 130 comprising at least one proximity sensor 210 is arranged. The proximity sensor 210 is configured to detect a presence of an object without physical contact. This may be achieved by emitting electromagnetic radiation and detecting changes in the emitted radiation, i.e. in an electromagnetic field or a return signal. Hence, the emitted radiation forms a beam 220 of the proximity sensor 210 in which detections may be made in accordance with a presence of an object in the beam, i.e. in a detection volume of the proximity sensor. The type of the proximity sensor 210 is not limited in the context of the present invention as such. Proximity sensors 210 whose operation is based on capacitive coupling are especially applicable in the context of elevator systems since they may detect living objects and their sensing range is sufficient for the application area of elevator systems. For the sake of clarity, the number of the proximity sensors 210 is not limited in any way and it may be adjusted according to an implementation of the invention.
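For illustration only, and not as part of the original disclosure, the following minimal Python sketch shows one way the capacitive detection principle described above could be modelled; the class name, field names and numeric values are assumptions made for this sketch.

```python
# Illustrative sketch only: a capacitive proximity sensor reports a detection when
# the measured capacitance deviates from its no-object baseline by more than a
# configured delta. Names and values are assumptions, not defined by the patent.

from dataclasses import dataclass


@dataclass
class CapacitiveProximitySensor:
    sensor_id: str
    baseline: float       # capacitance measured with no object in the beam 220
    trigger_delta: float  # change in capacitance treated as a detection

    def object_present(self, measured_capacitance: float) -> bool:
        # A nearby (living) object increases the measured capacitance; a detection
        # is reported once the change exceeds the configured trigger delta.
        return (measured_capacitance - self.baseline) >= self.trigger_delta


if __name__ == "__main__":
    sensor = CapacitiveProximitySensor("car-rear", baseline=10.0, trigger_delta=1.5)
    print(sensor.object_present(10.2))  # False: nothing in the detection volume
    print(sensor.object_present(12.1))  # True: an object has entered the beam 220
```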
According to the present invention the elevator system may be arranged to operate so that the user interface surfaces 130 may be set to an active and to an inactive mode as will be described. The active and inactive modes may especially refer to a feature of establishing the I/O panel on the user interface surface 130.
Now, at least some aspects of the present invention may be described by referring to Figure 3A in which a method according to an embodiment of the present invention is schematically illustrated. In the following the method steps of the method according to the non-limiting embodiment of the invention are discussed step-by-step:
Regarding step 310:
In step 310 it may be detected that an object, such as a passenger, is present in a vicinity of at least one proximity sensor 210, i.e. in an operational area of the at least one proximity sensor 210, causing a detection. This may correspond to a situation in which the object has entered the elevator car 150 comprising the user interface surface 130 equipped with at least one proximity sensor 210, or in which the object resides close to such a user interface surface 130 implemented on a wall of a building where the elevator system resides, e.g. in a hall of the building.
In other words, the detection may occur so that a control unit 110 configured to receive sensor data from the number of proximity sensors 210 may trigger a detection when a value determinable from the data of the at least one proximity sensor meets (e.g. exceeds or is below) a predetermined limit, such as a so-called specific distance threshold.
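Purely as a non-limiting illustration, the following Python sketch outlines how a control unit could derive the step 310 detection from sensor data; the function name, the distance-based representation and the threshold value are assumptions of this sketch.

```python
# Illustrative sketch of step 310: register a detection for every proximity sensor
# whose reading indicates an object inside its operational area. The distance-based
# representation and the threshold value are assumptions of this sketch.

from typing import Dict, List

DISTANCE_THRESHOLD_M = 0.8  # assumed "specific distance threshold" in metres


def detect_presence(sensor_readings: Dict[str, float],
                    threshold_m: float = DISTANCE_THRESHOLD_M) -> List[str]:
    """Return the ids of all proximity sensors 210 whose estimated distance to the
    nearest object is at or below the predetermined limit."""
    return [sensor_id
            for sensor_id, distance in sensor_readings.items()
            if distance <= threshold_m]


if __name__ == "__main__":
    readings = {"car-left": 2.4, "car-rear": 0.5, "hall-3": 3.1}
    print(detect_presence(readings))  # ['car-rear'] -> proceeds to step 320
```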
Regarding step 320:
In response to the detection performed by the at least one proximity sensor 210 the control unit 110 may be configured to generate a signal causing an activation of at least one user interface surface 130. The activation of the user interface surface 130 may refer to, but is not limited to, a generation of an indication of which user interface surface 130, or even which portion of the user interface surface 130, is activated. The indication may e.g. be implemented with a visual signal, an audible signal or a haptic signal, or with any combination of these non-limiting examples. For example, the user interface surface 130 to be activated may correspond to at least one certain proximity sensor 210 causing the detection of the object and is, thus, activated in accordance with the at least one proximity sensor 210 which generated the detection of the presence of the object. In other words, the at least one proximity sensor 210 may e.g. reside on the same user interface surface 130 which is activated or on another user interface surface 130. The visual signal indicating the activated user interface surface 130 may e.g. be a lighting of the surface in any manner or a provision of textual or image content on the user interface surface 130. The audible signal may e.g. be generated with a loudspeaker implemented on the surface. Moreover, the haptic signal may e.g. be generated with so-called non-contact haptic technology, such as with ultrasound generated e.g. with an applicable loudspeaker implemented on the surface. The above given options of applicable signals for indicating the activated user interface surface 130 are non-limiting examples and other forms of indication may be used.
One aim of the indication may be to inform the object, such as a passenger of the elevator system, of the at least one user interface surface 130 which is activated to serve the object. The serving may refer to an interaction between the object, such as the passenger, and the elevator system causing the elevator system to operate as the object wishes.
In the manner as described, at least one user interface surface 130 may be activated for further use as will be described.
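As a non-limiting illustration of step 320, the sketch below selects the user interface surface associated with the triggering sensor and builds an activation signal including an indication for the user; the sensor-to-surface mapping and the signal format are assumptions of this sketch, not defined by the description.

```python
# Illustrative sketch of step 320: the surface 130 to activate is selected according
# to the proximity sensor 210 that produced the detection. The mapping and the
# signal format below are assumptions, not defined by the description.

from typing import Dict, List

# Assumed wiring between proximity sensors 210 and user interface surfaces 130.
SENSOR_TO_SURFACE: Dict[str, str] = {
    "car-left": "surface-car-left-wall",
    "car-rear": "surface-car-rear-wall",
    "hall-3": "surface-hall-floor-3",
}


def activate_surfaces(triggered_sensors: List[str]) -> List[dict]:
    """Build activation signals for the surfaces associated with the triggering
    sensors, including how the activation is indicated to the user."""
    signals = []
    for sensor_id in triggered_sensors:
        surface_id = SENSOR_TO_SURFACE.get(sensor_id)
        if surface_id is None:
            continue  # unknown sensor: nothing to activate
        signals.append({
            "surface": surface_id,
            "command": "activate",
            "indication": ["visual"],  # could equally be "audible" or "haptic"
        })
    return signals


if __name__ == "__main__":
    print(activate_surfaces(["car-rear"]))
```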
Next, an embodiment of the invention is described by referring to Figure 3B illustrating schematically further aspects of a method according to the embodiment of the invention.
Regarding steps 330 and 340:
According to an embodiment of the present invention, in response to the activation of the user interface surface 130 a user indication given through the user interface surface 130 may be detected 330. The user indication may cause a generation 340 of the I/O panel 140 on the user interface surface 130 operating as a display. A position of the I/O panel 140 may be relative to the position of the user indication detected by the user interface surface 130. For example, it may be arranged so that the I/O panel 140, having e.g. a predetermined size, is centred at the position of the detected user indication. Alternatively, the I/O panel 140 may be positioned in any other manner with respect to the position of the detected user indication, such as within a distance of the position of the detected user indication. As said, the I/O panel 140 may be displayed on the user interface surface 130 as a virtual I/O device activated to receive an input from the user.
The detection of the user indication 330 may be arranged in accordance with the type of the user interface surface 130. According to an embodiment of the invention the user interface surface 130, when activated as described, may implement a function of a touch screen, i.e. at least a portion of the surface is manufactured with electronics suitable to operate as the touch screen, and in response to a detection of a touch, e.g. with a finger or any other pointer, the I/O panel 140 may be displayed in a position relative to the position of the touch. According to another embodiment of the invention the user interface surface 130 may be equipped with a plurality of proximity sensors, which may be the same as used in the first detection (step 310) or other proximity sensors having a different distance threshold, or activation distance, from the proximity sensors 210. The distance threshold for the other proximity sensors applied in the detection of the user indication may be such that they generate the detection when an object, such as a finger, is close to the surface, for example within several centimetres. The other proximity sensors applied in the detection of the user indication 330 may be arranged as a proximity sensor array in order to enable an accurate detection which meets the requirements of the application area. Again, when the user of the elevator system provides the user indication, e.g. by bringing a finger close enough to the user interface surface 130 to be detected by at least one proximity sensor arranged to perform the task, the I/O panel 140 may be generated 340 in the same manner as already described.
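The following non-limiting Python sketch illustrates steps 330 and 340: once a user indication yields a position on the surface, a fixed-size I/O panel is established centred on that position and kept within the surface edges. The panel dimensions, the surface geometry and all names are assumptions of this sketch.

```python
# Illustrative sketch of steps 330 and 340: establish the I/O panel 140 centred on
# the position of the detected user indication, clamped so that the whole panel
# remains on the user interface surface 130. Dimensions are assumed values.

from dataclasses import dataclass


@dataclass
class Rect:
    x: float
    y: float
    width: float
    height: float


def establish_io_panel(indication_x: float, indication_y: float, surface: Rect,
                       panel_width: float = 0.30, panel_height: float = 0.40) -> Rect:
    """Centre a fixed-size I/O panel on the detected indication position and clamp
    it to the bounds of the user interface surface."""
    x = min(max(indication_x - panel_width / 2, surface.x),
            surface.x + surface.width - panel_width)
    y = min(max(indication_y - panel_height / 2, surface.y),
            surface.y + surface.height - panel_height)
    return Rect(x, y, panel_width, panel_height)


if __name__ == "__main__":
    wall = Rect(0.0, 0.0, 2.0, 2.3)            # e.g. a rear wall of the elevator car 150
    print(establish_io_panel(1.1, 1.2, wall))  # panel centred on the touch position
```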
Next some still further aspects relating to the solution according to an embodiment of the present invention are disclosed by referring to Figure 4.
Regarding steps 410 and 420:
As already referred to in the description above, the solution according to an embodiment of the invention enables the provision of an instruction to the elevator system in a sophisticated manner. Namely, in response to the generation of the I/O panel 140 visually on the user interface surface 130 with display electronics implemented on the surface, it may be detected if a user, cf. the object, provides an input 410 with the I/O panel 140. In other words, the elevator system may provide a plurality of options to be selected by providing an input through the I/O panel 140 for instructing the elevator system to operate accordingly. For example, the I/O panel may provide options for giving a destination call by the user of the elevator system. Hence, the user interface surface 130 may be arranged to generate a signal corresponding to the user selection among selectable objects, or options, displayed with the I/O panel. The signal representing the user input, and thus the selection, may be generated in accordance with the detection of the input in the same manner as already described. The signal may e.g. comprise data representing the position of the detection of the user input, the user input being a touch or a bringing of a pointer, such as a finger or any other applicable pointer, close to the surface. According to the embodiment of the invention the control unit may be arranged to determine, on the basis of the position information, the input the user intends by comparing the position of the detection with the position of the I/O panel 140, which is known by the control unit. Moreover, the control unit may be arranged to be aware of the positions of different input areas of the I/O panel 140 in order to distinguish between inputs. For example, different areas may be defined to correspond to different destination floors, or other instructions, on the I/O panel 140 and, hence, a desired instruction may be distinguished from other instructions. The above described arrangement of determining different instructions corresponding to input detected with the input device, such as the I/O panel 140, may utilize any known mechanism to implement such a detection. Furthermore, in response to a determination of the input and what is intended to be indicated with it, the control unit 110 may be arranged to generate 420 a control signal carrying information representing an instruction to one or more entities of the elevator system for completing the request indicated by the user. For example, if the input provided by the user of the elevator system indicates a destination floor, the control unit 110 may be configured to generate a control signal to an elevator drive for causing the elevator car to move from its current position to the indicated destination floor. On the other hand, if the input indicates that the user wishes to open the doors of the elevator car and/or the shaft doors, the control unit 110 may be arranged to generate a control signal, or control signals, to the corresponding door motors, for example. As discussed above in step 320, at least one user interface surface 130 may be activated in response to a detection of an object 310.
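For illustration only, the sketch below shows one way the control unit could resolve the detected input position against the known layout of the I/O panel and generate the corresponding control signal of step 420; the panel layout, instruction names and signal format are assumptions of this sketch.

```python
# Illustrative sketch of steps 410 and 420: compare the position of the detected
# input with the known layout of the I/O panel 140 and issue a control signal to the
# elevator drive or the door motors. Layout and signal names are assumed values.

from typing import Dict, Optional, Tuple

# Assumed panel layout: instruction name -> (x0, y0, x1, y1) in panel coordinates.
PANEL_LAYOUT: Dict[str, Tuple[float, float, float, float]] = {
    "floor_3": (0.00, 0.00, 0.15, 0.10),
    "floor_5": (0.15, 0.00, 0.30, 0.10),
    "open_doors": (0.00, 0.10, 0.30, 0.20),
}


def resolve_input(x: float, y: float) -> Optional[str]:
    """Return the instruction whose input area contains the detected position."""
    for name, (x0, y0, x1, y1) in PANEL_LAYOUT.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None


def generate_control_signal(instruction: str) -> dict:
    """Translate the resolved instruction into a control signal for the
    corresponding elevator entity (drive or door motors)."""
    if instruction.startswith("floor_"):
        return {"target": "elevator_drive",
                "destination_floor": int(instruction.split("_")[1])}
    if instruction == "open_doors":
        return {"target": "door_motors", "command": "open"}
    return {"target": "none", "command": "ignore"}


if __name__ == "__main__":
    hit = resolve_input(0.20, 0.05)
    if hit is not None:
        print(generate_control_signal(hit))  # destination call to floor 5
```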
This kind of approach is important especially in an elevator environment because passenger traffic occurs all the time and there is a need to manage the user interface surfaces 130 so that the number of false operations may be minimized and the possibility to provide an instruction to the elevator system is given to the correct user, such as a new passenger in the elevator car 150. Hence, an activation of at least one assumedly optimal user interface surface 130 may be performed, and in that manner the people flow in the elevator system may be managed in an efficient manner. As is derivable from the description above, the present invention is, at least to some extent, based on utilizing a so-called integrated electronics solution in the context of conveyor systems, such as elevator systems. By means of the integrated electronics a user interface surface 130 may be manufactured with the necessary functionalities. Figure 5 illustrates schematically an example of the user interface surface 130 according to an embodiment of the invention into which a plurality of functionalities is integrated. For example, the user interface surface 130 may be manufactured so that the whole surface is equipped with first electronics 510 suitable to be used for displaying patterns. The first electronics 510 may e.g. refer to LEDs (Light Emitting Diodes), or similar. Moreover, second electronics 520, which comprise detection elements for receiving input from the user e.g. in the manner as discussed in the context of Figure 3B (e.g. steps 330 and 340), may be implemented on the user interface surface 130. On the surface 130 electronics, such as proximity sensors 210, may be implemented for detecting if an object enters a vicinity of the user interface surface 130, triggering the method as described. In Figure 5 an I/O panel 140 visualized by means of the first electronics 510 is schematically illustrated. Thus, when the user of the elevator system provides an indication through the virtual I/O panel 140, the indication, or the input, may be detected by the second electronics 520 and e.g. a position of the detection may be provided from the user interface surface 130 to the control unit 110. The control unit 110 may determine an instruction corresponding to the position of the input by being aware of what is displayed as the I/O panel to the user. The user interface surface 130 may also comprise other electronics providing further functionalities, such as for providing an audible signal or a haptic signal. The integrated electronics manufactured e.g. by means of printing (i.e. printed electronics) provide a sophisticated way to implement overlapping functionalities with different kinds of electronic components.
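As a non-limiting illustration of the layered surface of Figure 5, the sketch below models a user interface surface combining a display layer (first electronics 510) and an input-detection layer (second electronics 520) that reports detected positions to the control unit; all class and method names are assumptions made for this sketch.

```python
# Illustrative sketch of the layered surface of Figure 5: the display layer (first
# electronics 510) visualizes the I/O panel 140, and the input-detection layer
# (second electronics 520) reports the position of a detected input upstream to the
# control unit 110. Names are assumptions, not defined by the patent.

from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple


@dataclass
class UserInterfaceSurface:
    surface_id: str
    report_to_control_unit: Callable[[dict], None]
    displayed_panel: Optional[Tuple[float, float, float, float]] = None  # x, y, w, h

    def display_panel(self, panel: Tuple[float, float, float, float]) -> None:
        # First electronics 510: visualize the I/O panel rectangle on the surface.
        self.displayed_panel = panel

    def on_input_detected(self, x: float, y: float) -> None:
        # Second electronics 520: forward the detected position together with the
        # currently displayed panel so the control unit can resolve the instruction.
        self.report_to_control_unit(
            {"surface": self.surface_id, "position": (x, y), "panel": self.displayed_panel}
        )


if __name__ == "__main__":
    events: List[dict] = []
    surface = UserInterfaceSurface("surface-car-rear-wall", events.append)
    surface.display_panel((0.95, 1.0, 0.30, 0.40))
    surface.on_input_detected(1.05, 1.10)
    print(events)  # the control unit 110 resolves the instruction from this event
```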
Figure 6 schematically illustrates a control unit 110 according to an embodiment of the invention. The control unit 110 may comprise a processing unit 610, a memory 620 and a communication interface 630, among other entities. The processing unit 610, in turn, may comprise one or more processors arranged to implement one or more tasks for implementing at least part of the method steps as described. For example, the processing unit 610 may be arranged to control an operation of a number of user interface surfaces 130 and any other entities of the present invention in the manner as described. The memory 620 may be arranged to store computer program code which, when executed by the processing unit 610, causes the control unit 110 to operate as described. Moreover, the memory 620 may be arranged to store, as described, the reference value and any other data. The communication interface 630 may be arranged to implement, e.g. under control of the processing unit 610, one or more communication protocols enabling the communication with external entities as described. The communication interface may comprise the necessary hardware and software components for enabling e.g. wireless communication and/or communication in a wired manner.
In the description of the solution according to the present invention it is discussed that the sensors are arranged to detect an object in order to generate a signal on a detection, if any. The object causing the detection, as well as the detection mechanism itself, may be selected in accordance with the need. For example, in step 310 of the method the detection may occur if it is detected that a person, or any similar object such as a wheelchair, enters an operational area of the sensor(s) in question. On the other hand, the object to be detected, e.g. in step 330, may be the same as in step 310, such as the body of the person, or it may e.g. be a smaller object, such as a hand or a finger or any similar pointer.
The specific examples provided in the description given above should not be construed as limiting the applicability and/or the interpretation of the appended claims. Lists and groups of examples provided in the description given above are not exhaustive unless otherwise explicitly stated.

Claims

WHAT IS CLAIMED IS:
1. A method for an elevator system, the method comprising: detecting (310) a presence of an object in an operational area of at least one proximity sensor (210) based on data received from the at least one proximity sensor (210), and generating a signal for activating (320) at least one user interface surface (130), the at least one user interface surface (130) being activated is selected in accordance with the at least one proximity sensor (210) from which the data generating a detection of the presence of the object in the operational area of the at least one proximity sensor (210) is received.
2. The method of claim 1, the method further comprising: detecting (330) a user indication provided through the at least one user interface surface (130), and generating (340) a signal causing an establishment of at least one I/O panel (140) on the at least one user interface surface (130) in accordance with the user indication provided through the at least one user interface surface (130).
3. The method of any of the preceding claims, wherein an activation of the at least one user interface surface (130) is indicated with at least one of the following: visual signal, audible signal, haptic signal.
4. The method of any of the preceding claims, wherein a detection of the user indication provided through the at least one user interface surface (130) is performed by: detecting of a touch on the at least one user interface surface (130), detecting a presence of a pointer within a distance threshold of the at least one proximity sensor (210).
5. The method of any of the preceding claims, wherein a position of the at least one established I/O panel (140) on the at least one user interface surface (130) is arranged to be relative to the position of the detection of the user indication.
6. The method of claim 5, wherein the position of the at least one established I/O panel (140) on the at least one user interface surface (130) corresponds to the position of the detection of the user indication.
7. A control unit (110) of an elevator system, the control unit (110) is configured to: detect (310) a presence of an object in an operational area of at least one proximity sensor (210) based on data received from the at least one proximity sensor (210), and generate a signal for activating (320) at least one user interface surface (130), the at least one user interface surface (130) being activated is selected in accordance with the at least one proximity sensor (210) from which the data generating a detection of the presence of the object in the operational area of the at least one proximity sensor (210) is received.
8. The control unit (110) of claim 7, the control unit (110) is further configured to: detect (330) a user indication provided through the at least one user interface surface (130), and generate (340) a signal causing an establishment of at least one I/O panel (140) on the at least one user interface surface (130) in accordance with the user indication provided through the at least one user interface surface (130).
9. The control unit (110) of claim 7 or 8, wherein the control unit (110) is configured to generate a signal causing an indication of an activation of the at least one user interface surface (130) with at least one of the following: visual signal, audible signal, haptic signal.
10. The control unit (110) of any of claims 7 - 9, wherein the control unit (110) is configured to perform a detection of the user indication from data provided through the at least one user interface surface (130) by: detecting of a touch on the at least one user interface surface (130), detecting a presence of a pointer within a distance threshold of the at least one proximity sensor (210).
11. The control unit (110) of any of claims 7 - 10, wherein the control unit (110) is configured to determine a position of the at least one established I/O panel (140) on the at least one user interface surface (130) to be relative to the position of the detection of the user indication.
12. The control unit (110) of claim 11, wherein the control unit (110) is configured to determine the position of the at least one established I/O panel (140) on the at least one user interface surface (130) to correspond to the position of the detection of the user indication.
13. An elevator system comprising: at least one proximity sensor (210), at least one user interface surface (130), and a control unit (110) of claim 7.
14. The elevator system of claim 13, wherein the at least one user interface surface (130) is arranged to at least one of the following: an elevator car (150), a surface of a building where the elevator system resides.
15. A computer program which, when executed by at least one processor, causes a control unit (110) of an elevator system to perform the method according to any of claims 1 - 6.
PCT/FI2019/050063 2019-01-29 2019-01-29 User interface solution for elevator system WO2020157367A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2019/050063 WO2020157367A1 (en) 2019-01-29 2019-01-29 User interface solution for elevator system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2019/050063 WO2020157367A1 (en) 2019-01-29 2019-01-29 User interface solution for elevator system

Publications (1)

Publication Number Publication Date
WO2020157367A1

Family

ID=65324396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2019/050063 WO2020157367A1 (en) 2019-01-29 2019-01-29 User interface solution for elevator system

Country Status (1)

Country Link
WO (1) WO2020157367A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200062538A1 (en) * 2018-08-21 2020-02-27 Otis Elevator Company Inferred elevator car assignments based on proximity of potential passengers

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3299323A1 (en) * 2016-09-23 2018-03-28 Otis Elevator Company Secondary car operating panel for elevator cars
EP3315443A1 (en) * 2016-10-26 2018-05-02 Otis Elevator Company Elevator virtual hall call panel systems and methods of operation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3299323A1 (en) * 2016-09-23 2018-03-28 Otis Elevator Company Secondary car operating panel for elevator cars
EP3315443A1 (en) * 2016-10-26 2018-05-02 Otis Elevator Company Elevator virtual hall call panel systems and methods of operation

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200062538A1 (en) * 2018-08-21 2020-02-27 Otis Elevator Company Inferred elevator car assignments based on proximity of potential passengers
US11554931B2 (en) * 2018-08-21 2023-01-17 Otis Elevator Company Inferred elevator car assignments based on proximity of potential passengers

Similar Documents

Publication Publication Date Title
EP3105160B1 (en) Elevator operator interface
EP3412613B1 (en) Hand detection for elevator operation
JP4312722B2 (en) Elevator call registration device
JP2011162307A (en) Destination floor registering device of elevator
CN112585075B (en) Elevator signaling device with adaptive visibility
EP3318524B1 (en) Destination dispatch passenger detection
US20220274803A1 (en) Operating method for an elevator operating device with a touch-sensitive screen system
WO2020157367A1 (en) User interface solution for elevator system
US20220106159A1 (en) Touchless Elevator User Interface
JP2018002428A (en) Car operation panel
EP3495300B1 (en) Elevator door system and a method for calling an elevator car
US20220197390A1 (en) Haptic feedback solution for one or more elevator systems
CN109019199B (en) Elevator system
CN109835784B (en) Elevator system
US20240051790A1 (en) Elevator button panel controller for providing contactless access to elevator button panel and method thereof
JPH11165969A (en) Elevator system
JP4519468B2 (en) Function guidance device
EP4238918A1 (en) Elevator system and elevator control method
CN117163783A (en) Non-contact button input device and method
US20230418536A1 (en) Display control device and display control method
JP7359324B1 (en) Elevator system, information processing device, information provision method, and computer-readable recording medium
KR102282333B1 (en) Gantry typed robot-elevator interface device
JP2013184791A (en) Destination floor registration device provided in elevator hall
WO2023058156A1 (en) Elevator operation device
CN116730133A (en) Elevator system and elevator control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19703769

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19703769

Country of ref document: EP

Kind code of ref document: A1