WO2022034260A1 - Controlling of elevator system - Google Patents

Controlling of elevator system

Info

Publication number
WO2022034260A1
Authority
WO
WIPO (PCT)
Prior art keywords
elevator system
user interface
touchless
air
arrangement
Prior art date
Application number
PCT/FI2020/050707
Other languages
French (fr)
Inventor
Antti Perko
Goutham Saravana PANDIAN
Obulesu T
Original Assignee
Kone Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kone Corporation
Publication of WO2022034260A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 Control systems of elevators in general
    • B66B1/34 Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/46 Adaptations of switches or switchgear
    • B66B1/461 Adaptations of switches or switchgear characterised by their shape or profile
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 Control systems of elevators in general
    • B66B1/34 Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/46 Adaptations of switches or switchgear
    • B66B1/468 Call registering systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B2201/00 Aspects of control systems of elevators
    • B66B2201/40 Details of the change of control mode
    • B66B2201/46 Switches or switchgear
    • B66B2201/4607 Call registering systems
    • B66B2201/4638 Wherein the call is registered without making physical contact with the elevator system

Abstract

The present invention relates to an arrangement of an elevator system (100) for controlling the elevator system (100), the arrangement comprising: at least one projector device (150) for projecting a user interface (160) in air; at least one image capturing device (170) for capturing a number of images from a position the user interface (160) is projected to; a touchless feedback device (180) for providing touchless haptic feedback; a control unit (140) configured to, in response to a receipt of the number of images, perform: detect (310) a pointing entity from image data received from the image capturing device (170); generate (320) a control signal to the touchless feedback device (180) for generating the touchless haptic feedback in response to a detection of the pointing entity from the image data. The invention also relates to a method, an elevator system, and a computer program product.

Description

CONTROLLING OF ELEVATOR SYSTEM
TECHNICAL FIELD
The invention concerns in general the technical field of elevators. More particularly, the invention concerns controlling of elevator systems.
BACKGROUND
Elevator systems interact with users through user interfaces allowing input and output of information between the parties. A typical example of such a user interface is an elevator calling device, such as a car operating panel (COP) or a destination operating panel (DOP). An interaction with the user interface is typically performed with a finger of the user when inputting e.g. service calls to the elevator system. In other words, the user touches an area of the user interface corresponding to her/his need, and the user interface generates an internal control signal in accordance with the input received from the user. A typical way to implement the user interface is a static installation, either a panel with one or more buttons or a touch screen.
Recently, solutions have been introduced that enable creating a user interface in air. One applicable technology for generating the user interface in air is so-called fog screen technology, in which a screen is established with water vapor and the user interface is projected onto it. Another new technology for generating the user interface in air is based on so-called photophoretic optical trapping, in which a projector generates an image in mid-air. Photophoretic optical trapping is based on manipulating tiny physical particles in mid-air by means of special projection lenses. The user interfaces generated in air may be 2-dimensional or 3-dimensional. The user interaction with the user interfaces generated in air may be based on a detection of user behavior with respect to the user interface. The user behavior, such as pointing to elements of the user interface with a hand or a finger, may be monitored with an applicable sensor, such as an image capturing device suitable for implementing stereographic imaging, which enables a determination of a position of the pointing means within the 3-dimensional space.
In principle, the above-described solutions for implementing the user interface in the air and detecting user behavior with respect to it are applicable in elevator systems. However, in the context of elevator systems there is a need for solutions that provide a response to the user regarding the interaction with the user interface.
SUMMARY
The following presents a simplified summary in order to provide basic understanding of some aspects of various invention embodiments. The summary is not an extensive overview of the invention. It is neither intended to identify key or critical elements of the invention nor to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a simplified form as a prelude to a more detailed description of exemplifying embodiments of the invention.
An object of the invention is to present an arrangement, a method, an elevator system, and a computer program for controlling an elevator system.
The objects of the invention are reached by an arrangement, a method, an elevator system, and a computer program as defined by the respective independent claims.
According to a first aspect, an arrangement of an elevator system for controlling the elevator system is provided, the arrangement comprising: at least one projector device for projecting a user interface in air; at least one image capturing device for capturing a number of images from a position the user interface is projected to; a touchless feedback device for providing touchless haptic feedback; a control unit configured to, in response to a receipt of the number of images from the image capturing device, perform: detect a pointing entity from image data received from the image capturing device; generate a control signal to the touchless feedback device for generating the touchless haptic feedback in response to a detection of the pointing entity from the image data.
The control unit may further be configured to control an operation of the at least one projector device for generating the user interface in air.
Further, the user interface may comprise at least one object visible to a user, the object defining a position in the air to interact with the elevator system.
The control unit may also be configured to perform a detection of the pointing entity from the image data based on a detection that the pointing entity resides in the position corresponding to the at least one object visible to the user.
Alternatively or in addition, the control unit may be configured to cause a generation of the touchless haptic feedback by generating vibrations in air. For example, the vibrations may be generated at frequencies of more than 20 kHz.
According to a second aspect, a method for controlling an elevator system is provided, the method, performed by an apparatus, comprising: receiving a number of images from an image capturing device, the images representing a position of a user interface generated in air; detecting a pointing entity from image data received from the image capturing device; and generating a control signal to a touchless feedback device for generating touchless haptic feedback in response to a detection of the pointing entity from the image data.
Further, the method may comprise: controlling an operation of a projector device for generating the user interface in the air. Further, the user interface may comprise at least one object visible to a user, the object defining a position in the air to interact with the elevator system.
A detection of the pointing entity from the image data may be performed based on a detection that the pointing entity resides in the position corresponding to the at least one object visible to the user.
Alternatively or in addition, a generation of the touchless haptic feedback may be caused by generating vibrations in air. For example, the vibrations may be generated at frequencies of more than 20 kHz.
According to a third aspect, an elevator system comprising an arrangement according to the first aspect as defined above is provided.
According to a fourth aspect, a computer program is provided, the computer program comprising computer readable program code configured to cause performing of the method according to the second aspect as defined above when said program code is run on one or more computing apparatuses.
The expression "a number of" refers herein to any positive integer starting from one, e.g. to one, two, or three.
The expression "a plurality of" refers herein to any positive integer starting from two, e.g. to two, three, or four.
Various exemplifying and non-limiting embodiments of the invention both as to constructions and to methods of operation, together with additional objects and advantages thereof, will be best understood from the following description of specific exemplifying and non-limiting embodiments when read in connection with the accompanying drawings.
The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of unrecited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of “a” or “an”, i.e. a singular form, throughout this document does not exclude a plurality.
BRIEF DESCRIPTION OF FIGURES
The embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.
Figure 1 illustrates schematically an elevator system according to an example.
Figure 2 illustrates schematically a touchless feedback device according to an example.
Figure 3 illustrates schematically a method according to an example.
Figure 4 illustrates schematically an apparatus according to an example.
DESCRIPTION OF THE EXEMPLIFYING EMBODIMENTS
The specific examples provided in the description given below should not be construed as limiting the scope and/or the applicability of the appended claims. Lists and groups of examples provided in the description given below are not exhaustive unless otherwise explicitly stated.
Figure 1 illustrates schematically an example of an elevator system 100 comprising at least one elevator car 110 which may travel in a shaft under control of a drive 120. The drive 120 may be controlled by an elevator controller 130. The elevator controller 130 may be a central controlling entity of the elevator system 100 which may generate control signals to the drive and other elevator entities, e.g. in accordance with service calls received e.g. from persons visiting a building in which the elevator system 100 is implemented. The elevator system 100 may comprise other entities than the ones mentioned above. For example, the elevator system 100 may comprise a plurality of doors, such as elevator car doors and hall doors, as well as signaling devices, such as gongs and similar, for providing information to persons intending to interact with the elevator system 100, such as by using it.
Further, Figure 1 illustrates schematically an arrangement for allowing interaction between the elevator system 100 and a user, such as a generation of service calls to the elevator system 100. The arrangement may comprise a control unit 140, a projector device 150, an image capturing device 170, and a touchless feedback device 180. The control unit 140 may be arranged to control an operation of the mentioned entities. The control unit 140 of the arrangement may be dedicated to the arrangement, or it may also be arranged to perform other control operations. In some examples, the control operations of the control unit 140 may be implemented in the elevator controller 130. In case the control unit 140 is a separate entity from the elevator controller 130, it may be arranged that the elevator controller 130 is a master device with respect to the control unit 140, which may operate as a slave device under the control of the elevator controller 130.
The projector device 150 may be a device configured to generate an image of a user interface 160 of the elevator system 100 in accordance with control by the control unit 140. The image of the user interface 160 may be called a virtual user interface. The user interface 160 may refer to any user interface of the elevator system 100 through which the person may interact with the elevator system, as is described herein. The user interface 160 may e.g. represent a car operating panel (COP), a destination operating panel (DOP), or any other panel. The projector device 150 is selected so that it is suitable for generating the user interface 160 in air by projecting a predefined image, or images, as 2-dimensional or 3-dimensional objects. The predefined image may e.g. be stored in data storage, such as in an internal memory of the control unit 140 or of the projector device 150, wherefrom the image may be retrieved e.g. in response to a control signal generated by the control unit 140. The projection of the user interface 160 in air may be implemented by applying so-called photophoretic optical trapping, which may be implemented by the projector device 150 e.g. under control of the control unit 140. In another example, the projection into the air may be achieved by establishing a so-called fog screen in a desired location in the space and projecting the user interface 160 onto the fog screen. Any other applicable technology for projecting the user interface 160 in the air may also be applied.
In general, the image representing the user interface 160, and any elements of it, has a known location in a space defined by a coordinate system defined in the control unit 140. This enables defining one or more areas of the user interface 160, or even volumes, in the coordinate system (cf. 2-dimensional or 3-dimensional). The information relating to the location of the virtual user interface, as well as its size and shape, may be managed by the control unit 140.
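The coordinate-system bookkeeping above lends itself to a simple hit test. Below is a minimal sketch of registering user-interface objects as volumes in a shared coordinate system and testing a pointing position against them; all names and dimensions are hypothetical, since the publication does not prescribe any particular implementation.

```python
from dataclasses import dataclass

@dataclass
class UIObject:
    """A selectable object of the projected user interface, modelled as an
    axis-aligned box in the shared coordinate system (assumed representation)."""
    label: str                            # e.g. a destination floor
    min_xyz: tuple[float, float, float]   # lower corner, metres
    max_xyz: tuple[float, float, float]   # upper corner, metres

    def contains(self, point: tuple[float, float, float]) -> bool:
        # The point lies inside the object's volume on every axis.
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, self.min_xyz, self.max_xyz))

def hit_test(objects: list[UIObject], fingertip: tuple[float, float, float]):
    """Return the UI object the fingertip currently resides in, if any."""
    for obj in objects:
        if obj.contains(fingertip):
            return obj
    return None

# Example: two floor buttons projected side by side as 5 cm cubes.
buttons = [
    UIObject("floor 1", (0.00, 0.00, 1.00), (0.05, 0.05, 1.05)),
    UIObject("floor 2", (0.10, 0.00, 1.00), (0.15, 0.05, 1.05)),
]
print(hit_test(buttons, (0.12, 0.02, 1.02)).label)  # -> floor 2
```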
The arrangement may further comprise at least one image capturing device 170 arranged to capture data for monitoring the area, or the volume, of the user interface 160. The image capturing device 170 may e.g. be a stereo camera configured to operate at a high frame rate in order to be capable of obtaining a plurality of consecutive image frames of the target. The type of the image capturing device 170 may be selected in accordance with the implementation of the user interface 160. More specifically, the image capturing device 170 may be configured to monitor whether an external entity exists within the monitored area, or volume, and to generate images thereof. For example, the external entity may refer to a user's hand 190, such as a finger, or any device suitable for pointing to one or more objects of the user interface 160 generated in the air. In other words, the image capturing device 170 may capture a plurality of images and deliver them to the control unit 140 for analysis.
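As a side note on how a stereo camera can yield the required 3-dimensional position: for a rectified stereo pair, depth follows from pixel disparity as Z = f * B / d. A tiny sketch with hypothetical camera parameters:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point seen by a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# e.g. 700 px focal length, 6 cm baseline, 35 px disparity -> 1.2 m depth
print(stereo_depth(700.0, 0.06, 35.0))
```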
The overall operation as described so far may be considered such that the user interface 160 is generated in the air for prompting the user to provide an input. The user may react to this by bringing a pointing entity, such as a finger, towards the user interface 160 floating in the air and reaching the object matching his/her desire, such as a destination floor, so as to indicate a service call to the elevator system 100. The image capturing device 170 may be configured to obtain images, e.g. in response to a detection that a pointing entity enters an operational area, or volume, of the image capturing device, and start delivering the images to the control unit 140 through a communication channel established between the entities.
In accordance with an example, the control unit 140 may be configured to analyse the received images and detect the position of the pointing entity within the monitored area, or space, from the image data. This may be achieved by including the pointing entity, i.e. a tip of the pointing entity, in the same coordinate system in which the objects of the user interface 160 are defined, and determining the position of the pointing entity in that coordinate system. The control unit 140 may be arranged to detect a predefined input from the image data in order to detect whether the user wishes to indicate a selection of an object of the user interface 160. The predefined input may e.g. be a certain gesture detectable from the image data, or a detection that the pointing entity is held in the same position in the user interface 160, possibly with a predefined margin, for a time exceeding a threshold time set for the selection. As a result, the control unit 140 may be arranged to decide that the user wants to select the object corresponding to the position of the pointing entity in which the predefined input is detected.
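To make the dwell-based variant of the predefined input concrete, the following is a small sketch of a dwell detector; the position margin and the threshold time are assumed values, not figures given in the publication.

```python
import math

class DwellSelector:
    """Registers a selection when the pointing entity stays within a position
    margin for longer than a threshold time (both values assumed)."""

    def __init__(self, margin_m: float = 0.01, threshold_s: float = 0.8):
        self.margin = margin_m
        self.threshold = threshold_s
        self._anchor = None   # position where the current dwell started
        self._start = None    # timestamp of the dwell start

    def update(self, position, timestamp: float):
        """Feed one fingertip position per analysed frame; returns the dwell
        position once a selection is detected, otherwise None."""
        if self._anchor is None or math.dist(position, self._anchor) > self.margin:
            self._anchor, self._start = position, timestamp   # dwell (re)starts
            return None
        if timestamp - self._start >= self.threshold:
            anchor, self._anchor, self._start = self._anchor, None, None
            return anchor                                     # selection made
        return None
```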
In response to the decision performed by the control unit 140 based on a predefined rule, or a set of rules, the control unit 140 may be arranged to generate a control signal in the elevator system 100 to provide service to the user in accordance with the selection performed by the user through the user interface 160. For example, the control unit 140 may be arranged to generate and transmit such a control signal to the elevator controller 130 in case the entities are separate from each other, and the elevator controller 130 may then control respective entities, such as the elevator drive 120, in accordance with control data received from the control unit 140. In addition to the generation of the control signal in the elevator system 100, the control unit 140 may be arranged to generate a control signal to a touchless feedback device 180, wherein the control signal comprises data causing the touchless feedback device 180 to generate an indication to the user on the interaction with the user interface 160, which may refer to a successful selection. The indication generated by the touchless feedback device 180 is given by generating vibrations in air sensible in a haptic manner. In other words, the vibrations are generated at such a frequency that the recipient may sense the vibration haptically. A non-limiting example of applicable frequencies for providing the touchless feedback is frequencies over 20 kHz, i.e. frequencies in the ultrasound range. In some preferred implementations the touchless feedback is provided with so-called ultrahaptic technology, in which the applied frequency is around 40 kHz. A further advantage of generating the vibration in the mentioned frequency range is that those frequencies are not typically within the human auditory range. The touchless feedback device 180, also referable to as an ultrahaptic device, may be selected so that it is able to generate vibrations detectable even from a distance of one meter from the device 180. In the context of directing and transferring the feedback signal as described to the destination, it may be modulated with a low-frequency signal, e.g. within a frequency range of 0.4 to 500 Hz, so as to increase the carrying power of the feedback signal.
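The modulation described above can be sketched as amplitude modulation: an ultrasonic carrier of around 40 kHz is shaped by a low-frequency envelope within the 0.4 to 500 Hz range, the envelope being what the skin perceives. The sample rate and the envelope frequency below are assumptions for illustration only.

```python
import numpy as np

def am_feedback_signal(duration_s: float = 0.5, fs: int = 192_000,
                       carrier_hz: float = 40_000.0, envelope_hz: float = 200.0):
    """40 kHz carrier amplitude-modulated with a low-frequency envelope."""
    t = np.arange(int(duration_s * fs)) / fs
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * envelope_hz * t))  # range 0..1
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

signal = am_feedback_signal()  # samples to drive the transducer array with
```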
As briefly mentioned above, the feedback given with the touchless feedback device 180 may be given in response to a detection of a predefined input from image data received from the image capturing device 170. The input may represent the selection of a predefined object from the virtual user interface 160, for example. In some other examples, it may be arranged that the touchless feedback device 180 is controlled to provide the haptic feedback in accordance with a type of detection made from the image data. For example, the arrangement may be configured so that a first pattern of feedback is generated in response to a detection that the pointing entity, such as the user's hand 190, may be detected from at least one of the obtained images, which may be interpreted to correspond to a situation in which the user's hand resides within the area, or space, of the user interface 160. Further, a second pattern of feedback may e.g. be generated when the pointing entity, such as a fingertip, resides in a position corresponding to a selectable object through which it is possible to provide input to the elevator system 100. Still further, a third pattern of feedback may be generated to provide an indication of a selection, e.g. in response to a predefined input, such as the fingertip having been held in the same position, with applicable margins, over a predefined period of time exceeding a threshold. The patterns of the feedback may e.g. be differentiated from each other by applying different frequencies in the feedback, or e.g. by activating and deactivating the feedback in different manners in the different patterns. In the described manner it is possible to provide guidance to the user when the user is interacting with the elevator system 100 through the user interface.
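A minimal sketch of mapping the three detection types to distinguishable feedback patterns follows; the pattern parameters (envelope frequency and duty cycle) are purely illustrative and not specified in the publication.

```python
from enum import Enum, auto

class Detection(Enum):
    HAND_IN_UI_AREA = auto()   # first pattern: hand within the UI area
    OVER_SELECTABLE = auto()   # second pattern: fingertip over a selectable object
    SELECTION_MADE = auto()    # third pattern: predefined input completed

# Hypothetical parameters: (envelope frequency in Hz, on/off duty cycle).
FEEDBACK_PATTERNS = {
    Detection.HAND_IN_UI_AREA: (50.0, 1.0),   # continuous, gentle
    Detection.OVER_SELECTABLE: (200.0, 1.0),  # continuous, sharper
    Detection.SELECTION_MADE:  (200.0, 0.5),  # pulsed confirmation
}

def feedback_for(detection: Detection) -> tuple[float, float]:
    return FEEDBACK_PATTERNS[detection]
```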
Figure 2 illustrates schematically an example of a touchless feedback device 180 which may be applied in the system as shown in Figure 1. The touchless feedback device 180 may comprise an array 210 of vibration transducers 220 by means of which the haptic feedback may be generated. In order to provide the feedback at the position where the indication of the user selection is to be given, a respective controller of the touchless feedback device 180 may generate individual control signals to each of the vibration transducers 220 so as to cause activation of the vibration transducers 220 at different instants of time, establishing time differences between the feedback signals. In this manner it is possible to make the signals arrive at the same time at the same point, called a focal point, which may e.g. correspond to the position at which the selection is made with the user interface 160. In addition to this, a signal may be generated to the projector device to generate a concurrent visual effect at the same point in the user interface 160, for example. In this manner, the user may be informed in an effective manner of the selection made through the interaction.
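The time-difference focusing described above amounts to firing the farthest transducer first so that all wavefronts reach the focal point simultaneously, i.e. delay_i = (d_max - d_i) / c. A sketch with a hypothetical 2 x 2 array:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def focusing_delays(transducers, focal_point):
    """Per-transducer trigger delays (seconds) so that all emissions arrive
    at the focal point at the same instant."""
    distances = [math.dist(t, focal_point) for t in transducers]
    d_max = max(distances)
    return [(d_max - d) / SPEED_OF_SOUND for d in distances]

# Four transducers of a small array; focal point 20 cm above, slightly off-centre.
array = [(x, y, 0.0) for x in (-0.05, 0.05) for y in (-0.05, 0.05)]
delays = focusing_delays(array, (0.03, 0.0, 0.20))
print([f"{d * 1e6:.1f} us" for d in delays])
```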
Figure 3 illustrates schematically an example of a method applicable in an elevator system 100 comprising an arrangement as described in the foregoing description. The control unit 140 may control a generation of a user interface 160 in air by generating at least one control signal to a projector device 150. In addition, the control unit 140 of the arrangement may be configured to generate a control signal to an image capturing device 170 for causing a capture of images from an area, or a space, into which the user interface 160 is generated. The control unit 140 may be configured to analyse the image data and to detect 310 a pointing entity from at least one image. In an example implementation, the detection 310 may be generated in response to a detection that the pointing entity resides in a position corresponding to an object of the user interface 160 through which input may be given to the elevator system 100. In some other example embodiments, the detection 310 of the pointing entity may be generated in response to a detection that the pointing entity resides in an area, or a space, of the user interface 160. In response to the detection 310, the control unit 140 may be configured to generate 320 a control signal causing a touchless feedback device 180 to generate an output producing a touchless haptic sensation perceivable by the user.
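Tying the steps together, here is a sketch of the loop of Figure 3 reusing the helpers from the earlier sketches (hit_test, DwellSelector, feedback_for); the camera, feedback-device, and elevator interfaces are hypothetical placeholders, not APIs named in the publication.

```python
def control_loop(camera, feedback_device, selector, ui_objects, elevator):
    # Hypothetical interfaces: camera.frames() yields frames that carry a
    # timestamp and a find_pointing_entity() detector returning a fingertip
    # with a 3-D position, or None when no pointing entity is visible.
    for frame in camera.frames():
        fingertip = frame.find_pointing_entity()
        if fingertip is None:
            continue
        obj = hit_test(ui_objects, fingertip.position)      # detect (310)
        if obj is None:
            continue
        feedback_device.play(*feedback_for(Detection.OVER_SELECTABLE))
        selected = selector.update(fingertip.position, frame.timestamp)
        if selected is not None:
            # generate (320) the confirming feedback and issue the service call
            feedback_device.play(*feedback_for(Detection.SELECTION_MADE))
            elevator.register_call(obj.label)
```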
For example, the control unit 140 may refer to a computing device, such as a server device, a laptop computer, or a PC, as schematically illustrated in Figure 4. Figure 4 illustrates schematically as a block diagram a non-limiting example of the control unit 140 applicable to perform the method in cooperation with other entities. The block diagram of Figure 4 depicts some components of a device that may be employed to implement an operation of the control unit 140. The apparatus comprises a processor 410 and a memory 420. The memory 420 may store data and computer program code 425. The apparatus may further comprise communication means 430 for wired and/or wireless communication with other entities, such as with at least one projector device 150, at least one image capturing device 170, a touchless feedback device 180, and an elevator controller 130. Furthermore, I/O (input/output) components 440 may be arranged, together with the processor 410 and a portion of the computer program code 425, to provide a user interface for receiving input from a user, such as from a technician of the elevator system 100, and/or providing output to the user of the system when necessary. In particular, the user I/O components may include user input means, such as one or more keys or buttons, a keyboard, a touchscreen, or a touchpad, etc. The user I/O components may include output means, such as a display or a touchscreen. The components of the apparatus may be communicatively coupled to each other via a bus 450 that enables transfer of data and control information between the components.
The memory 420 and a portion of the computer program code 425 stored therein may be further arranged, with the processor 410, to cause the apparatus, i.e. the device, to perform a method as described in the foregoing description. The processor 410 may be configured to read from and write to the memory 420. Although the processor 410 is depicted as a single component, it may be implemented as one or more separate processing components. Similarly, although the memory 420 is depicted as a single component, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent / semi-permanent / dynamic / cached storage.
The computer program code 425 may comprise computer-executable instructions that implement functions that correspond to steps of the method when loaded into the processor 410. As an example, the computer program code 425 may include a computer program consisting of one or more sequences of one or more instructions. The processor 410 is able to load and execute the computer program by reading the one or more sequences of one or more instructions included therein from the memory 420. The one or more sequences of one or more instructions may be configured to, when executed by the processor 410, cause the apparatus to perform the method as described. Hence, the apparatus may comprise at least one processor 410 and at least one memory 420 including the computer program code 425 for one or more programs, the at least one memory 420 and the computer program code 425 configured to, with the at least one processor 410, cause the apparatus to perform the method as described.
The computer program code 425 may be provided e.g. as a computer program product comprising at least one computer-readable non-transitory medium having the computer program code 425 stored thereon, which computer program code 425, when executed by the processor 410, causes the apparatus to perform the method. The computer-readable non-transitory medium may comprise a memory device or a record medium such as a CD-ROM, a DVD, a Blu-ray disc, or another article of manufacture that tangibly embodies the computer program. As another example, the computer program may be provided as a signal configured to reliably transfer the computer program.
Still further, the computer program code 425 may comprise a proprietary application, such as computer program code for executing the control of the elevator system in the manner as described.
Any of the programmed functions mentioned may also be performed in firmware or hardware adapted to or programmed to perform the necessary tasks.
Moreover, as mentioned, a functionality of the apparatus implementing the control unit 140 may be shared between a plurality of devices as a distributed computing environment. For example, the distributed computing environment may comprise a plurality of devices as schematically illustrated in Figure 4 arranged to implement the method in cooperation with each other in a predetermined manner. For example, each device may be arranged to perform one or more method steps, and in response to a finalization of its dedicated step it may hand the continuation of the process over to the next device. The devices may be, for example, the control unit 140 and the elevator controller 130.
Some aspects relate to an elevator system 100 comprising the arrangement as described in the foregoing description and wherein the method as described may be performed in order to control the elevator system accordingly.
An advantage of the examples as described is that they provide a sophisticated solution for interacting with the elevator system 100. The solution provides a way to establish the user interface at an optimal position with respect to people flow in the premises, as well as to enable touchless interaction with the elevator system, which may e.g. prevent the spread of diseases.
The specific examples provided in the description given above should not be construed as limiting the applicability and/or the interpretation of the appended claims. Lists and groups of examples provided in the description given above are not exhaustive unless otherwise explicitly stated.

Claims

WHAT IS CLAIMED IS:
1. An arrangement of an elevator system (100) for controlling the elevator system (100), the arrangement comprising: at least one projector device (150) for projecting a user interface (160) in air, at least one image capturing device (170) for capturing a number of images from a position the user interface (160) is projected to, a touchless feedback device (180) for providing touchless haptic feedback, a control unit (140) configured to, in response to a receipt of the number of images from the image capturing device (170), perform: detect (310) a pointing entity from image data received from the image capturing device (170), generate (320) a control signal to the touchless feedback device (180) for generating the touchless haptic feedback in response to a detection of the pointing entity from the image data.
2. The arrangement of the elevator system (100) of claim 1, wherein the control unit (140) is further configured to control an operation of the at least one projector device (150) for generating the user interface (160) in air.
3. The arrangement of the elevator system (100) of any of the preceding claims, wherein the user interface (160) comprises at least one object visible to a user, the object defining a position in the air to interact with the elevator system (100).
4. The arrangement of the elevator system (100) of any of the preceding claims, wherein the control unit (140) is configured to perform a detection (310) of the pointing entity from the image data based on a detection that the pointing entity resides in the position corresponding to the at least one object visible to the user.
5. The arrangement of the elevator system (100) of any of the preceding claims, wherein the control unit (140) is configured to cause a generation of the touchless haptic feedback by generating vibrations in air.
6. The arrangement of the elevator system (100) of claim 5, wherein the vibrations are generated at frequencies of more than 20 kHz.
7. A method for controlling an elevator system (100), the method, performed by an apparatus (140), comprising: receiving a number of images from an image capturing device (170), the images representing a position of a user interface (160) generated in air, detecting (310) a pointing entity from image data received from the image capturing device (170), and generating (320) a control signal to a touchless feedback device (180) for generating touchless haptic feedback in response to a detection of the pointing entity from the image data.
8. The method of claim 7, the method further comprising: controlling an operation of a projector device (150) for generating the user interface (160) in the air.
9. The method of claim 7 or 8, wherein the user interface (160) comprises at least one object visible to a user, the object defining a position in the air to interact with the elevator system (100).
10. The method of any of preceding claims 7-9, wherein a detection (310) of the pointing entity from the image data is performed based on a detection that the pointing entity resides in the position corresponding to the at least one object visible to the user.
11. The method of any of the preceding claims 7-10, wherein a generation of the touchless haptic feedback is caused by generating vibrations in air.
12. The method of claim 11, wherein the vibrations are generated at frequencies of more than 20 kHz.
13. An elevator system (100) comprising an arrangement according to any of claims 1-6.
14. A computer program comprising computer readable program code configured to cause performing of the method according to any of claims 7 to 12 when said program code is run on one or more computing apparatuses.
PCT/FI2020/050707 2020-08-14 2020-10-27 Controlling of elevator system WO2022034260A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FIPCT/FI2020/050530 2020-08-14
FI2020050530 2020-08-14

Publications (1)

Publication Number
WO2022034260A1

Family

ID=72243155

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2020/050707 WO2022034260A1 (en) 2020-08-14 2020-10-27 Controlling of elevator system

Country Status (1)

Country Link
WO (1) WO2022034260A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005113399A1 (en) * 2004-04-30 2005-12-01 Otis Elevator Company Haptic hologram-enabled elevator call buttons
US20080291156A1 (en) * 2007-05-23 2008-11-27 Dietz Paul H Sanitary User Interface
US20150199011A1 (en) * 2014-01-14 2015-07-16 Microsoft Corporation Attractive and repulsive force feedback
US10691397B1 (en) * 2014-04-22 2020-06-23 sigmund lindsay clements Mobile computing device used to operate different external devices

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210380370A1 (en) * 2020-06-30 2021-12-09 Nouveau National LLC Handsfree elevator control system
US11738970B2 (en) * 2020-06-30 2023-08-29 Upward Technology Llc Handsfree elevator control system

Similar Documents

Publication Publication Date Title
US10052147B2 (en) Touch free operation of ablator workstation by use of depth sensors
JP5779641B2 (en) Information processing apparatus, method, and program
US9111326B1 (en) Designation of zones of interest within an augmented reality environment
US9595172B2 (en) Dataglove having tactile feedback and method
KR20150002786A (en) Interacting with a device using gestures
JP2015043154A (en) Information processing device, control method therefor, computer program, and storage medium
KR101019254B1 (en) apparatus having function of space projection and space touch and the controlling method thereof
JPH11283026A (en) Touch pad provided with fingerprint detection function, and information processor
CN103929603A (en) Image Projection Device, Image Projection System, And Control Method
KR20180053402A (en) A visual line input device, a visual line input method, and a recording medium on which a visual line input program is recorded
KR20180006133A (en) Electronic device and operating method thereof
JP2015197822A (en) Tactile sense control device, tactile sense control method, and program
US11068108B2 (en) Input device
CN109254658A (en) Tactile feedback method, haptic feedback devices and touch display unit
JP2011209579A (en) Image display system and control method thereof
WO2022034260A1 (en) Controlling of elevator system
US20230085751A1 (en) Controlling of elevator system
CN106774815B (en) Touch gestures determine method and device
CN107567609A (en) For running the method for input equipment, input equipment, motor vehicle
JP2014534525A (en) Pressure-based interaction for indirect touch input devices
US8866870B1 (en) Methods, apparatus, and systems for controlling from a first location a laser at a second location
CN113498029B (en) Interactive broadcast
JP2014170367A (en) Object detection device, object detection method, object detection system and program
KR102169236B1 (en) Touchscreen device and method for controlling the same and display apparatus
WO2020157367A1 (en) User interface solution for elevator system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20799764; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20799764; Country of ref document: EP; Kind code of ref document: A1)